HVT Scoring Cells with Layers using scoreLayeredHVT

Zubin Dowlaty, Srinivasan Sudarsanam, Somya Shambhawi

2024-02-20

1. Abstract

The HVT package is a collection of R functions for building topology preserving maps for rich multivariate data analysis, with an emphasis on large datasets (a large number of rows). The functions for this typical workflow are organized below:

  1. Data Compression: Vector quantization (VQ), HVQ (hierarchical vector quantization) using means or medians. This step compresses the rows (long data frame) using a compression objective.

  2. Data Projection: Dimensionality reduction of the compressed cells to 1D, 2D, or an interactive surface plot using Sammon's non-linear mapping algorithm. This step creates topology preserving map (also called embedding) coordinates in the desired output dimension.

  3. Tessellation: Create the cells required for visualization using the Voronoi tessellation method; the package includes heatmap plots for hierarchical Voronoi tessellations (HVT). This step enables data insights, visualization, and interaction with the topology preserving map. Useful for semi-supervised tasks.

  4. Scoring: Score new data sets and record their cell assignments using the map objects from the steps above, chaining a sequence of maps if required.
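As a minimal sketch of the projection step (step 2), Sammon's non-linear mapping is available in the MASS package that ships with R. This example projects synthetic 3D points to 2D; it is only an illustration of the embedding idea, not the HVT pipeline itself:

```r
library(MASS)  # recommended package shipped with R; provides sammon()

set.seed(240)
# Synthetic stand-in for compressed cell centroids: 50 points in 3D
centroids <- matrix(rnorm(150), ncol = 3)

# Sammon's mapping embeds the points in 2D while trying to preserve
# the pairwise distance structure (a topology preserving map)
proj <- sammon(dist(centroids), k = 2, trace = FALSE)

# proj$points holds the 50 x 2 embedding coordinates
dim(proj$points)
```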

2. Example : HVT with the Torus dataset

2.1.1 Import Dataset from Local

The user can provide an absolute or relative path in the cell below to access data on their computer, and set the import_data_from_local variable to TRUE to load the dataset from a local file.
Note: For this notebook, import_data_from_local has been set to FALSE because we simulate a dataset in the next section.

import_data_from_local = FALSE # expects logical input

  file_name <- " " #enter the file name
  file_path <- " " #enter the path of the file in local.

# Loading the data in the Rstudio environment 
# Please change the path in the code line below to the path location of the .csv file
if(import_data_from_local){
  file_load <- paste0(file_path, file_name)
  dataset_updated <- as.data.frame(fread(file_load))
  if(nrow(dataset_updated) > 0){
    paste0("File ", file_name, " having ", nrow(dataset_updated), " row(s) and ", ncol(dataset_updated), " column(s)",  " imported successfully. ") %>% cat("\n")
    # Round only the numeric columns in dataset
    dataset_updated <- dataset_updated %>% mutate_if(is.numeric, round, digits = 4)
    paste0("Code chunk executed successfully. Below table showing first 10 row(s) of the dataset.") %>% cat("\n")
    # Display imported dataset
    dataset_updated %>% head(10) %>% 
      as.data.frame() %>%
      DT::datatable(options = options, rownames = TRUE)
  }
  
} 

2.1.2 Simulate Dataset

In this section, we will use a simulated dataset. Given below is a simulated dataset called torus that contains 12000 observations and 3 features.

Let us see how to generate data for the torus. We use the geozoo library for this purpose. Geo Zoo (short for Geometric Zoo) is a compilation of geometric objects ranging from 3 to 10 dimensions. It contains regular or well-known objects, e.g., the cube and sphere, and some abstract objects, e.g., Boy’s surface, the torus, and the hyper-torus.

Here, we generate the data and store it in a variable called dataset_updated.

set.seed(240)
##torus data generation
torus <- geozoo::torus(p = 3,n = 12000)
dataset_updated <- data.frame(torus$points)
colnames(dataset_updated) <- c("x","y","z")


  if(nrow(dataset_updated) > 0){
paste0( "Dataset having ", nrow(dataset_updated), " row(s) and ", ncol(dataset_updated), " column(s)", " simulated successfully.") %>% cat("\n")  
    # Round only the numeric columns in dataset
    dataset_updated <- dataset_updated %>% mutate_if(is.numeric, round, digits = 4) 
    paste0("Code chunk executed successfully. The table below is showing first 20 row(s) of the dataset.") %>% cat("\n")
    # Display imported dataset
    dataset_updated %>% head(100) %>% 
      as.data.frame() %>%
      Table(scroll = TRUE, limit = 20)
  }
Dataset having 12000 row(s) and 3 column(s) simulated successfully.  
Code chunk executed successfully. The table below is showing first 20 row(s) of the dataset. 
x y z
-2.6282 0.5656 -0.7253
-1.4179 -0.8903 0.9455
-1.0308 1.1066 -0.8731
1.8847 0.1895 0.9944
-1.9506 -2.2507 0.2071
-1.4824 0.9229 0.9672
1.5755 -1.5162 -0.9824
-1.0046 -1.8170 -0.9971
-1.3507 0.7445 -0.8891
0.7920 -1.3482 -0.8998
2.8680 -0.0943 -0.4939
-2.3787 1.7986 -0.1878
-2.5734 -0.3077 0.8061
-0.4570 -1.6738 0.9643
-1.1740 0.5023 0.6908
1.1497 -1.5540 -0.9978
-0.8428 -0.5436 0.0755
2.7471 -0.9987 -0.3848
-2.4446 -1.6528 0.3097
-2.6487 -0.5745 0.7040

3. Data Understanding

3.1 Quick Peek of the Data

Summary of Dataset

The table below shows a summary of all (numeric and categorical) columns of the dataset.

calculate_statistics <- function(column_data) {
     if (is.numeric(column_data)) {
         a <- min(column_data, na.rm = TRUE)
         b <- as.numeric(quantile(column_data, probs = 0.25, na.rm = TRUE)[1])
         c <- median(column_data, na.rm = TRUE)
         d <- mean(column_data, na.rm = TRUE)
         e <- sd(column_data, na.rm = TRUE)
         f <- as.numeric(quantile(column_data, probs = 0.75, na.rm = TRUE)[1])
         g <- max(column_data, na.rm = TRUE)
         
         # Combine the statistics into a data frame and set row names to an empty string
         stats_data <- data.frame(Min = a, Q1 = b, Median = c, Mean = d, sd = e, Q3 = f, Max = g)
         row.names(stats_data) <- ""
     } else {
         cat("Column is not numeric and was skipped.\n")
         return(NULL)
     }
     return(stats_data)
 }

# Apply the function to each column of dataset_updated_train
statistics_list <- lapply(dataset_updated, calculate_statistics)

# Print the result
print(statistics_list)
$x
     Min        Q1 Median         Mean       sd       Q3    Max
 -2.9977 -1.149025 -0.007 -0.001443592 1.505965 1.140325 2.9995

$y
     Min        Q1  Median       Mean       sd       Q3    Max
 -2.9993 -1.113325 0.01305 0.01034638 1.485554 1.133725 2.9993

$z
 Min       Q1 Median       Mean        sd      Q3 Max
  -1 -0.71195 0.0153 0.00442255 0.7117983 0.71855   1

Structure of Data

The section below shows the structure of the data.

dataset_updated %>% str()
'data.frame':   12000 obs. of  3 variables:
 $ x: num  -2.63 -1.42 -1.03 1.88 -1.95 ...
 $ y: num  0.566 -0.89 1.107 0.19 -2.251 ...
 $ z: num  -0.725 0.946 -0.873 0.994 0.207 ...

3.2 Deleting Irrelevant Columns

The cell below allows the user to drop irrelevant columns.

########################################################################################
################################## User Input Needed ###################################
########################################################################################

# Add column names which you want to remove
want_to_delete_column <- "no"

    del_col<-c(" `column_name` ")  

if(want_to_delete_column == "yes"){
   dataset_updated <-  dataset_updated[ , !(names(dataset_updated) %in% del_col)]
  print("Code chunk executed successfully. Overview of data types after removing the selected columns")
  str( dataset_updated)
}else{
  paste0("No columns removed. Please enter a column name above if you want to remove a column.") %>% cat("\n")
}
No columns removed. Please enter a column name above if you want to remove a column. 

3.3 Formatting and Renaming Columns

The code below contains a user defined function to rename or reformat any column that the user chooses.

########################################################################################
################################## User Input Needed ###################################
########################################################################################

# convert the column names to lower case
colnames( dataset_updated) <- colnames( dataset_updated) %>% casefold()

## rename column ?
want_to_rename_column <- "no" ## type "yes" if you want to rename a column

## renaming a column of a dataset 
rename_col_name <- " `column_name` " ## use small letters
rename_col_name_to <- " `new_name` "

if(want_to_rename_column == "yes"){
  names( dataset_updated)[names( dataset_updated) == rename_col_name] <- rename_col_name_to
}

# remove space, comma, dot from column names
spaceless <- function(x) {colnames(x) <- gsub(pattern = "[^[:alnum:]]+",
                             replacement = ".",
                             names(x));x}
 dataset_updated <- spaceless( dataset_updated)

## below is the dataset summary
paste0("Successfully converted the column names to lower case; check the renamed column name if you changed one.") %>% cat("\n")
Successfully converted the column names to lower case; check the renamed column name if you changed one. 
str( dataset_updated) ## showing summary for updated 
'data.frame':   12000 obs. of  3 variables:
 $ x: num  -2.63 -1.42 -1.03 1.88 -1.95 ...
 $ y: num  0.566 -0.89 1.107 0.19 -2.251 ...
 $ z: num  -0.725 0.946 -0.873 0.994 0.207 ...

3.4 Changing Data Type of Columns

The section allows the user to change the data type of columns of his/her choice.

########################################################################################
################################## User Input Needed ###################################
########################################################################################

# If you want to change column type, change a below variable value to "yes"
want_to_change_column_type <- "no"

# you can change column type into numeric or character only
change_column_to_type <- "character" ## numeric

if(want_to_change_column_type == "yes" && change_column_to_type == "character"){
########################################################################################
################################## User Input Needed ###################################
########################################################################################
  select_columns <- c("panel_var") ###### Add column names you want to change here #####
   dataset_updated[select_columns]<- sapply( dataset_updated[select_columns],as.character)
  paste0("Code chunk executed successfully. Datatype of selected column(s) has been changed to character.")
  #str( dataset_updated)
}else if(want_to_change_column_type == "yes" && change_column_to_type == "numeric"){
  select_columns <- c('gearbox_oil_temperature')
   dataset_updated[select_columns]<- sapply( dataset_updated[select_columns],as.numeric)
  paste0("Code chunk executed successfully. Datatype of selected column(s) has been changed to numeric.")
  #str( dataset_updated)
}else{
  paste0("Datatype of columns have not been changed.") %>% cat("\n")
}
Datatype of columns have not been changed. 
dataset_updated <- do.call(data.frame, dataset_updated)
str( dataset_updated)
'data.frame':   12000 obs. of  3 variables:
 $ x: num  -2.63 -1.42 -1.03 1.88 -1.95 ...
 $ y: num  0.566 -0.89 1.107 0.19 -2.251 ...
 $ z: num  -0.725 0.946 -0.873 0.994 0.207 ...

3.5 Checking and Removing Duplicates

The presence of duplicate observations can be misleading; this section helps remove such rows from the dataset.

want_to_remove_duplicates <- "yes"  ## type "no" for choosing to not remove duplicates

## removing duplicate observation if present in the dataset
if(want_to_remove_duplicates == "yes"){
  
   dataset_updated <-  dataset_updated %>% unique()
  paste0("Code chunk executed successfully, duplicates if present successfully removed. Updated dataset has ", nrow( dataset_updated), " row(s) and ", ncol( dataset_updated), " column(s)") %>% print()
  cat("\n")
  str( dataset_updated) ## showing summary for updated dataset
} else{
  paste0("Code chunk executed successfully, NO duplicates were removed") %>% print()
}
[1] "Code chunk executed successfully, duplicates if present successfully removed. Updated dataset has 12000 row(s) and 3 column(s)"

'data.frame':   12000 obs. of  3 variables:
 $ x: num  -2.63 -1.42 -1.03 1.88 -1.95 ...
 $ y: num  0.566 -0.89 1.107 0.19 -2.251 ...
 $ z: num  -0.725 0.946 -0.873 0.994 0.207 ...

3.6 List of Numerical and Categorical Column Names

# Return the column type 
CheckColumnType <- function(dataVector) {
  #Check if the column type is "numeric" or "character" & decide type accordingly
  if (class(dataVector) == "integer" || class(dataVector) == "numeric") {
    columnType <- "numeric"
  } else { columnType <- "character" }
  #Return the result
  return(columnType)
}
### Loading the list of numeric columns in variable
numeric_cols <<- colnames( dataset_updated)[unlist(sapply( dataset_updated, 
                                                       FUN = function(x){ CheckColumnType(x) == "numeric"}))]

### Loading the list of categorical columns in variable
cat_cols <- colnames( dataset_updated)[unlist(sapply( dataset_updated, 
                                                   FUN = function(x){ 
                                                     CheckColumnType(x) == "character"|| CheckColumnType(x) == "factor"}))]

paste0("Code chunk executed successfully, list of numeric and categorical variables created.") %>% cat()
Code chunk executed successfully, list of numeric and categorical variables created.
paste0("\n\n Numerical Column(s): \n Count : ", length(numeric_cols), "\n") %>% cat()


 Numerical Column(s): 
 Count : 3
paste0(numeric_cols) %>% print()
[1] "x" "y" "z"
paste0("\n Categorical Column(s): \n Count : ", length(cat_cols), "\n") %>% cat()

 Categorical Column(s): 
 Count : 0
paste0(cat_cols) %>% print()
character(0)

3.7 Filtering Dataset for Analysis

In this section, the dataset can be filtered for required row(s) for further analysis.

want_to_filter_dataset <- "no" ## type "yes" in case you want to filter
filter_col <- " "  ## Enter Column name to filter
filter_val <- " "        ## Enter Value to exclude for the column selected

if(want_to_filter_dataset == "yes"){
   dataset_updated <- filter_at( dataset_updated
                              , vars(contains(filter_col))
                              , all_vars(. != filter_val))
  
  paste0("Code chunk executed successfully, dataset filtered successfully on required columns. Updated dataset has ", nrow( dataset_updated), " row(s) and ", ncol( dataset_updated), " column(s)") %>% print()
  cat("\n")
  str( dataset_updated) ## showing summary for updated dataset
  
} else{
  paste0("Code chunk executed successfully, entire dataset is available for analysis.") %>% print()
}
[1] "Code chunk executed successfully, entire dataset is available for analysis."

3.8 Missing Value Analysis

Missing values in the training data can lead to a biased model, because the behavior and relationships of those values with other variables cannot be analyzed correctly; this can lead to wrong calculations or classifications. Missing values are commonly grouped into 3 types: Missing Completely at Random (MCAR), Missing at Random (MAR), and Missing Not at Random (MNAR).

Missing Value on Entire dataset

na_total <- sum(is.na( dataset_updated))/prod(dim( dataset_updated))
if(na_total == 0){
  paste0("In the uploaded dataset, there is no missing value") %>% cat("\n")
}else{
  na_percentage <- paste0(sprintf(na_total*100, fmt = '%#.2f'),"%")
  paste0("Percentage of missing value in entire dataset is ",na_percentage) %>% cat("\n")
}
In the uploaded dataset, there is no missing value 

Missing Value on Column-level

The following code visualizes the missing values (if any) using a bar chart.

The gg_miss_upset function is used to visualize the patterns of missingness, or rather the combinations of missingness across cases, when missing values are present.

# Below code gives you missing value in each column
paste0("Number of missing value in each column") %>% cat("\n")
Number of missing value in each column 
print(sapply( dataset_updated, function(x) sum(is.na(x))))
x y z 
0 0 0 
missing_col_names <- names(which(sapply( dataset_updated, anyNA)))

total_na <- sum(is.na( dataset_updated))
# visualize the missing values (if any) using bar chart
if(total_na > 0 && length(missing_col_names) > 1){
  paste0("Code chunk executed successfully. Visualizing the missing values using bar chart") %>% cat("\n")
  gg_miss_upset( dataset_updated,
  nsets = 10,
  nintersects = NA)
}else if(total_na > 0){
   dataset_updated %>%
  DataExplorer::plot_missing() 
  # paste0("Code chunk executed successfully. Only one column ",missing_col_names," have missing values ", sum(is.na( dataset_updated)),".") %>% cat("\n")
}else{
  paste("Code chunk executed successfully. No missing values exist.") %>% cat("\n")
}
Code chunk executed successfully. No missing values exist. 

Missing Value Treatment

In this section, the user can decide how to handle missing values in the dataset. Both column(s) and row(s) can be removed from the dataset, should the user choose to do so.

Drop Column(s) with Missing Values

The below code accepts user input and deletes the specified column.

########################################################################################
################################## User Input Needed ###################################
########################################################################################

# OR do you want to drop column specific column
drop_column_na <- "yes" ## type "yes" to drop column(s)
# write column name that you want to drop
drop_column_name <- c(" ") #enter column name
if(drop_column_na == "yes"){
  dataset_updated <-  dataset_updated[ , which(!names( dataset_updated) %in% drop_column_name)]
  paste0("Code chunk executed, selected column(s) dropped successfully.") %>% print()
  cat("\n")
  str( dataset_updated)
} else {
  paste0("Code chunk executed, missing value not removed (if any).") %>% cat("\n")
  cat("\n")
}
[1] "Code chunk executed, selected column(s) dropped successfully."

'data.frame':   12000 obs. of  3 variables:
 $ x: num  -2.63 -1.42 -1.03 1.88 -1.95 ...
 $ y: num  0.566 -0.89 1.107 0.19 -2.251 ...
 $ z: num  -0.725 0.946 -0.873 0.994 0.207 ...

Drop Row(s) with Missing Values

The below code accepts user input and deletes rows.

# Do you want to drop row(s) containing "NA"
drop_row <- "no" ## type "yes" to delete missing value observations
if(drop_row == "yes"){
  
  # imputing blank with NAs and removing all rows containing NAs
  #  dataset_updated[ dataset_updated == ""] <- NA
  # removing missing values from data
   dataset_updated <-  dataset_updated %>% na.omit()
  
  paste0("Code chunk executed, missing values successfully identified and removed. Updated dataset has ", nrow( dataset_updated), " row(s) and ", ncol( dataset_updated), " column(s)") %>% print()
  cat("\n")
  # str( dataset_updated)
  
} else{
  paste0("Code chunk executed, missing value(s) not removed (if any).") %>% cat("\n")
  cat("\n")
}
Code chunk executed, missing value(s) not removed (if any). 

3.8.1 One-Hot Encoding

This technique encodes each categorical value as a set of binary (0/1) indicator columns, one column per class. This is done because most models can only handle features that have numeric values.
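A minimal base-R sketch of one-hot encoding; the torus dataset has no categorical columns, so the color column here is purely hypothetical:

```r
# Hypothetical data frame with one categorical column
df <- data.frame(x = c(1.2, 3.4, 5.6),
                 color = factor(c("red", "blue", "red")))

# model.matrix() expands a factor into 0/1 indicator columns;
# the "- 1" drops the intercept so every level gets its own column
dummies <- model.matrix(~ color - 1, data = df)

# Bind the indicators back to the numeric columns
encoded <- cbind(df["x"], as.data.frame(dummies))
colnames(encoded)
```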

Given below is the length of unique values in each categorical column

cat_cols <-
  colnames(dataset_updated)[unlist(sapply(
    dataset_updated,
    FUN = function(x) {
      CheckColumnType(x) == "character" ||
        CheckColumnType(x) == "factor"
    }
  ))]

apply(dataset_updated[cat_cols], 2, function(x) {
  length(unique(x))
})
integer(0)

Selecting categorical columns with a small number of unique values for dummification

########################################################################################
################################## User Input Needed ###################################
########################################################################################
# Do you want to dummify the categorical variables?

dummify_cat <- FALSE ## TRUE,FALSE

# Select the columns on which dummification is to be performed
dum_cols <- c("location.type","class")
########################################################################################
[1] "One-Hot Encoding was not performed on dataset."

3.8.2 Check for Singularity

# Check data for singularity
singular_cols <- sapply(dataset_updated,function(x) length(unique(x))) %>%  # convert to dataframe
  data.frame(Unique_n = .) %>% dplyr::filter(Unique_n == 1) %>% 
  rownames() %>% data.frame(Constant_Variables = .)

if(nrow(singular_cols) != 0) {                              
  singular_cols  %>% DT::datatable()
} else {
  paste("There are no singular columns in the dataset") %>% htmltools::HTML()
}
There are no singular columns in the dataset
# Display variance of columns
data <- dataset_updated %>% dplyr::summarise_if(is.numeric, var) %>% t() %>% 
  data.frame() %>% round(3) #%>% DT::datatable(colnames = "Variance")

colnames(data) <- c("Variance")
Table(data,scroll = FALSE)
Variance
x 2.268
y 2.207
z 0.507

3.8.3 Selecting only Numeric Cols after Dummification

numeric_cols=as.vector(sapply(dataset_updated, is.numeric))
dataset_updated=dataset_updated[,numeric_cols]
colnames(dataset_updated)
[1] "x" "y" "z"

3.9 Final Dataset Summary

All further operations will be performed on the following dataset.

nums <- colnames(dataset_updated)[unlist(lapply(dataset_updated, is.numeric))]
cat(paste0("Final data frame contains ", nrow( dataset_updated), " row(s) and ", ncol( dataset_updated), " column(s). ", "Code chunk executed. Below table showing first 10 row(s) of the dataset."))
Final data frame contains 12000 row(s) and 3 column(s). Code chunk executed. Below table showing first 10 row(s) of the dataset.
dataset_updated <-  dataset_updated %>% mutate_if(is.numeric, round, digits = 4)

dataset_updated %>% head(10) %>%
  as.data.frame() %>%
  Table(scroll = FALSE)
x y z
-2.6282 0.5656 -0.7253
-1.4179 -0.8903 0.9455
-1.0308 1.1066 -0.8731
1.8847 0.1895 0.9944
-1.9506 -2.2507 0.2071
-1.4824 0.9229 0.9672
1.5755 -1.5162 -0.9824
-1.0046 -1.8170 -0.9971
-1.3507 0.7445 -0.8891
0.7920 -1.3482 -0.8998
      DT::datatable(
        dataset_updated %>%
          select_if(., is.numeric) %>%
          skimr::skim() %>%
          mutate_if(is.numeric, round, digits = 4) %>%
          rename_at(.vars = vars(starts_with("skim_")), .funs = funs(sub("skim_", "", .))) %>%
          rename_at(.vars = vars(starts_with("numeric.")), .funs = funs(sub("numeric.", "", .))) %>%
          select(-c(type, n_missing, complete_rate)) %>%
          mutate(n_row = nrow(dataset_updated),
                 n_missing = rowSums(is.na(.))
                 # ,n_non_missing = n_row - n_missing
                 ) ,
        selection = "none",
        # filter = "top",
        class = 'cell-border stripe',
        escape = FALSE,
        options = options,
        callback = htmlwidgets::JS(
          "var tips = ['Index showing column number',
                        'Columns used for building the HVT model',
                        'Histogram for individual column',
                        'Number of records for each feature',
                        'Number of missing (NA) values for each feature',
                        'Mean of individual column',
                        'Standard deviation of individual column',
                        '0th Percentile means that the values are smaller than all 100% of the rows',
                        '25th Percentile means that the values are bigger than 25% and smaller than only 75% of the rows',
                        '50th Percentile means that the values are bigger than 50% and smaller than only 50% of the rows',
                        '75th Percentile means that the values are bigger than 75% and smaller than only 25% of the rows',
                        '100th Percentile means that the values are bigger than 100% of the rows'],
                            header = table.columns().header();
                        for (var i = 0; i < tips.length; i++) {
                          $(header[i]).attr('title', tips[i]);
                        }"
        )
      )
#print(
 aa <-  dataset_updated %>%
    select_if(., is.numeric) %>%
    skimr::skim() %>%
    mutate_if(is.numeric, round, digits = 4) %>%
    rename_at(.vars = vars(starts_with("skim_")), .funs = funs(sub("skim_", "", .))) %>%
    rename_at(.vars = vars(starts_with("numeric.")), .funs = funs(sub("numeric.", "", .))) %>%
    select(-c(type, n_missing, complete_rate)) %>%
    mutate(n_row = nrow(dataset_updated),
           n_missing = rowSums(is.na(.))
           # ,n_non_missing = n_row - n_missing
           )
Table(aa,scroll = TRUE, limit = 20)
variable mean sd p0 p25 p50 p75 p100 hist n_row n_missing
x -0.0014 1.5060 -2.9977 -1.1490 -0.0070 1.1403 2.9995 ▅▇▇▇▅ 12000 0
y 0.0103 1.4856 -2.9993 -1.1133 0.0130 1.1337 2.9993 ▃▇▇▇▅ 12000 0
z 0.0044 0.7118 -1.0000 -0.7120 0.0153 0.7186 1.0000 ▇▃▃▃▇ 12000 0

4. Data Distribution

Variable Histograms

Shown below is the distribution of all the variables in the dataset.

eda_cols <- names(dataset_updated)
# Here we plot the distribution of columns selected by user for numerical transformation
dist_list <- lapply(1:length(eda_cols), function(i){
generateDistributionPlot(dataset_updated, eda_cols[i]) })
do.call(gridExtra::grid.arrange, args = list(grobs = dist_list, ncol = 2, top = "Distribution of Features"))

Box Plots

In this section, we plot box plots for each numeric column in the dataset. These plots display the median and interquartile range (IQR) of each column.

## the below function helps plotting quantile outlier plot for multiple variables
quantile_outlier_plots_fn <- function(data, outlier_check_var, data_cat = dataset_updated[, cat_cols], numeric_cols = numeric_cols){
    # lower threshold
    lower_threshold <- stats::quantile(data[, outlier_check_var], .25,na.rm = TRUE) - 1.5*(stats::IQR(data[, outlier_check_var], na.rm = TRUE))
    
    # upper threshold
    upper_threshold <- stats::quantile(data[,outlier_check_var],.75,na.rm = TRUE) + 1.5*(stats::IQR(data[,outlier_check_var],na.rm = TRUE))
    
    # Look for outliers based on thresholds
    data$QuantileOutlier <- data[,outlier_check_var] > upper_threshold | data[,outlier_check_var] < lower_threshold

  # Plot box plot
  quantile_outlier_plot <- ggplot2::ggplot(data, ggplot2::aes(x="", y = data[,outlier_check_var])) +
             ggplot2::geom_boxplot(fill = 'blue',alpha=0.7) + 
             ggplot2::theme_bw() + 
             ggplot2::theme(panel.border=ggplot2::element_rect(size=0.1),panel.grid.minor.x=ggplot2::element_blank(),panel.grid.major.x=ggplot2::element_blank(),legend.position = "bottom") + ggplot2::ylab(outlier_check_var) + ggplot2::xlab("")
  data <- cbind(data[, !names(data) %in% c("QuantileOutlier")] %>% round(2), outlier = data[, c("QuantileOutlier")])
  data <- cbind(data, data_cat)  
  return(list(quantile_outlier_plot, data, lower_threshold, upper_threshold))
}
## the below code gives the interactive plot for Quantile Outlier analysis for numerical variables 
box_plots <- list()
for (x in names(dataset_updated)) {

box_plots[[x]] <- quantile_outlier_plots_fn(data = dataset_updated, outlier_check_var = x)[[1]]

}

gridExtra::grid.arrange(grobs = box_plots, ncol = 3)

Correlation Matrix

In this section we calculate the Pearson correlation, a bivariate measure of the linear correlation between two numeric columns. The output is a matrix.
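As no code accompanies this section, here is a minimal sketch using base R's cor() on synthetic data standing in for the numeric columns of dataset_updated:

```r
set.seed(240)
# Synthetic stand-in for the numeric columns
df <- data.frame(x = rnorm(100), y = rnorm(100))
df$z <- df$x + rnorm(100, sd = 0.1)  # z is built to correlate with x

# Pearson is cor()'s default method
corr_mat <- round(cor(df, method = "pearson"), 3)
corr_mat  # 3 x 3 symmetric matrix with 1s on the diagonal
```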

4.1 Train - Test Split

Let us first split the data into train and test. We will use 80% of the data as train and remaining as test.

## 80% of the sample size
smp_size <- floor(0.80 * nrow(dataset_updated))

## set the seed to make your partition reproducible
set.seed(240)
train_ind <- sample(seq_len(nrow(dataset_updated)), size = smp_size)

dataset_updated_train <- dataset_updated[train_ind, ]
dataset_updated_test <- dataset_updated[-train_ind, ]

The train data contains 9600 rows and 3 columns. The test data contains 2400 rows and 3 columns.

4.1.1 Train Distribution

eda_cols <- names(dataset_updated_train)
# Here we plot the distribution of columns selected by user for numerical transformation
dist_list <- lapply(1:length(eda_cols), function(i){
generateDistributionPlot(dataset_updated_train, eda_cols[i]) })
do.call(gridExtra::grid.arrange, args = list(grobs = dist_list, ncol = 2, top = "Distribution of Features"))

4.1.2 Test Distribution

eda_cols <- names(dataset_updated_test)
# Here we plot the distribution of columns selected by user for numerical transformation
dist_list <- lapply(1:length(eda_cols), function(i){
generateDistributionPlot(dataset_updated_test, eda_cols[i]) })
do.call(gridExtra::grid.arrange, args = list(grobs = dist_list, ncol = 2, top = "Distribution of Features"))

5. Map A : Base Compressed Map

Let us try to visualize the compressed Map A from the flow diagram below.

Figure 1: Data Segregation with highlighted bounding box in red around compressed map A

This package can perform vector quantization using the following algorithms -

For more information on vector quantization, refer to the following link.

The trainHVT function constructs highly compressed hierarchical Voronoi tessellations. The raw data is first scaled, and the scaled data is supplied as input to the vector quantization algorithm. The algorithm compresses the dataset until a user-defined compression percentage is achieved, using the quantization error as a threshold: for a given user-defined compression percentage we obtain ‘n’ cells, and each of the cells formed should have a quantization error below the threshold quantization error.
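The idea can be sketched with base R's kmeans(): quantize the data into cells, compute each cell's quantization error, and report the share of cells below the threshold. This is a simplified analogue of trainHVT's compression summary, not the package's actual implementation (which supports hierarchy, k-medoids, and L1/L2 metrics):

```r
set.seed(240)
# Synthetic 3D data standing in for the scaled training set
X <- matrix(rnorm(3000), ncol = 3)

# Vector quantization via k-means into 50 cells
km <- kmeans(X, centers = 50, nstart = 5, iter.max = 50)

# Per-cell quantization error: mean L2 distance of members to their centroid
quant_err <- sapply(seq_len(nrow(km$centers)), function(k) {
  members <- X[km$cluster == k, , drop = FALSE]
  centroid <- matrix(km$centers[k, ], nrow(members), ncol(X), byrow = TRUE)
  mean(sqrt(rowSums((members - centroid)^2)))
})

# Share of cells whose quantization error falls below a threshold
threshold <- 1.0
compression <- mean(quant_err < threshold)
```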

Let’s try to comprehend the trainHVT function first before moving ahead.

trainHVT(
  dataset,
  min_compression_perc,
  n_cells,
  depth,
  quant.err,
  distance_metric = c("L1_Norm", "L2_Norm"),
  error_metric = c("mean", "max"),
  quant_method = c("kmeans", "kmedoids"),
  normalize = TRUE,
  diagnose = FALSE,
  hvt_validation = FALSE,
  train_validation_split_ratio = 0.8
)

Each of the parameters of the trainHVT function is explained below:

The output of the trainHVT function (a list of 6 elements) is explained below:

We will use the trainHVT function to compress our data while preserving the essential features of the dataset. Our goal is to achieve a data compression of at least 80%. In situations where the compression ratio does not meet the desired target, we can adjust the model parameters, such as the quantization error threshold or the number of cells, and then rerun the trainHVT function.

This is already covered in the HVT vignette; please refer to it for more information.

Model Parameters

set.seed(240)
torus_mapA <- trainHVT(
  dataset_updated_train,
  n_cells = 900,
  depth = 1,
  quant.err = 0.1,
  projection.scale = 10,
  normalize = FALSE,
  distance_metric = "L1_Norm",
  error_metric = "max",
  quant_method = "kmeans"
)

Let’s check the compression summary for torus.

compressionSummaryTable(torus_mapA[[3]]$compression_summary)
segmentLevel noOfCells noOfCellsBelowQuantizationError percentOfCellsBelowQuantizationErrorThreshold parameters
1 900 767 0.85 n_cells: 900 quant.err: 0.1 distance_metric: L1_Norm error_metric: max quant_method: kmeans

We successfully compressed 85% of the data with the n_cells parameter set to 900. The next step involves performing data projection on the compressed data: the compressed data will be transformed and projected onto a lower-dimensional space to visualize and analyze it in a more manageable form.

As per the manual, torus_mapA[[3]] gives us detailed information about the hierarchical vector quantized data. torus_mapA[[3]][['summary']] gives tabular data containing the number of points, the quantization error, and the codebook.

The datatable displayed below is the summary from torus_mapA showing Cell.ID, Centroids and Quantization Error for each of the 900 cells.

summaryTable(torus_mapA[[3]]$summary,scroll = TRUE,limit = 500)
Segment.Level Segment.Parent Segment.Child n Cell.ID Quant.Error x y z
1 1 1 10 767 0.09 2.19 0.34 0.97
1 1 2 13 54 0.1 -2.13 1.48 -0.80
1 1 3 8 528 0.06 1.24 2.40 0.71
1 1 4 15 350 0.1 0.48 2.52 -0.82
1 1 5 7 662 0.05 0.08 -1.92 1.00
1 1 6 5 248 0.05 -0.32 1.77 0.98
1 1 7 12 864 0.08 1.94 -1.90 -0.70
1 1 8 4 873 0.06 2.92 -0.62 -0.17
1 1 9 8 459 0.05 -0.42 -0.92 -0.16
1 1 10 4 272 0.04 -1.19 0.05 -0.58
1 1 11 9 830 0.09 2.70 0.13 0.70
1 1 12 6 313 0.04 -1.10 -0.14 -0.45
1 1 13 8 801 0.11 2.70 0.73 0.60
1 1 14 11 618 0.06 0.89 -0.61 -0.38
1 1 15 22 601 0.08 0.99 -0.20 0.16
1 1 16 8 581 0.07 1.45 2.29 0.70
1 1 17 15 523 0.07 -0.08 -1.18 -0.58
1 1 18 9 428 0.05 0.27 0.98 -0.18
1 1 19 11 227 0.05 -1.01 0.99 0.81
1 1 20 9 108 0.08 -1.38 1.88 -0.93
1 1 21 14 420 0.05 -0.63 -0.78 -0.06
1 1 22 12 613 0.06 0.74 -0.83 -0.45
1 1 23 12 195 0.07 -0.66 1.84 1.00
1 1 24 8 855 0.07 1.73 -1.92 0.81
1 1 25 8 196 0.06 -1.23 1.00 0.91
1 1 26 16 567 0.08 0.42 -0.91 0.10
1 1 27 21 160 0.13 -0.42 2.54 0.81
1 1 28 10 497 0.08 -0.85 -2.53 -0.74
1 1 29 5 860 0.07 1.50 -2.27 0.69
1 1 30 10 625 0.06 1.23 0.27 0.68
1 1 31 12 394 0.06 0.11 1.16 0.56
1 1 32 10 484 0.05 -0.25 -0.97 -0.08
1 1 33 19 594 0.08 0.34 -1.29 -0.74
1 1 34 7 721 0.07 1.11 -1.31 -0.96
1 1 35 5 679 0.09 1.58 0.55 0.95
1 1 36 9 825 0.07 2.26 -0.91 -0.90
1 1 37 12 444 0.1 -0.81 -1.57 -0.97
1 1 38 8 466 0.04 0.75 1.66 0.98
1 1 39 10 270 0.08 -0.97 0.57 -0.48
1 1 40 9 851 0.13 1.12 -2.67 0.44
1 1 41 8 50 0.08 -1.75 2.05 -0.71
1 1 42 9 398 0.06 -0.75 -0.67 -0.07
1 1 43 8 846 0.1 2.21 -1.15 0.87
1 1 44 5 7 0.09 -1.94 2.29 0.05
1 1 45 5 40 0.06 -2.48 1.16 0.67
1 1 46 7 520 0.05 -0.03 -1.07 0.36
1 1 47 11 273 0.05 -1.12 0.20 -0.51
1 1 48 10 596 0.06 1.52 1.64 -0.97
1 1 49 10 491 0.07 -0.30 -1.16 0.59
1 1 50 14 777 0.07 1.14 -1.74 0.99
1 1 51 8 136 0.08 -0.61 2.63 -0.71
1 1 52 17 116 0.09 -1.54 1.55 -0.98
1 1 53 5 390 0.03 -0.79 -0.66 -0.24
1 1 54 17 191 0.08 -0.96 1.49 -0.97
1 1 55 12 441 0.11 -1.20 -2.63 0.43
1 1 56 16 11 0.11 -2.65 1.40 0.00
1 1 57 8 102 0.12 -0.75 2.83 -0.35
1 1 58 13 577 0.08 1.10 0.52 0.62
1 1 59 8 546 0.06 -0.14 -1.54 -0.89
1 1 60 7 66 0.06 -2.72 0.13 -0.69
1 1 61 10 722 0.09 2.12 1.57 0.77
1 1 62 8 744 0.07 1.79 -0.16 0.98
1 1 63 7 254 0.05 -1.14 0.47 -0.65
1 1 64 15 473 0.12 1.06 2.78 -0.20
1 1 65 9 781 0.13 0.49 -2.63 0.73
1 1 66 13 652 0.08 0.76 -1.06 0.72
1 1 67 11 492 0.08 -0.40 -1.42 -0.85
1 1 68 6 294 0.04 -1.10 0.05 -0.43
1 1 69 9 783 0.09 2.47 0.65 0.83
1 1 70 8 55 0.07 -2.30 1.22 -0.79
1 1 71 12 764 0.09 1.02 -1.91 -0.98
1 1 72 13 462 0.07 0.75 1.62 -0.98
1 1 73 9 848 0.1 2.04 -1.56 -0.82
1 1 74 14 643 0.11 -0.35 -2.48 0.86
1 1 75 5 1 0.07 -2.32 1.90 0.04
1 1 76 10 230 0.07 -0.80 1.32 -0.89
1 1 77 10 701 0.11 -0.24 -2.90 -0.40
1 1 78 10 521 0.07 -0.19 -1.35 -0.76
1 1 79 17 433 0.08 -0.94 -1.66 0.99
1 1 80 6 844 0.08 2.66 -0.51 -0.70
1 1 81 5 863 0.03 2.44 -1.10 0.73
1 1 82 9 541 0.09 1.32 2.01 -0.91
1 1 83 14 488 0.07 -0.24 -1.03 -0.34
1 1 84 14 370 0.06 -1.25 -1.56 -1.00
1 1 85 21 543 0.08 0.91 0.51 0.29
1 1 86 6 452 0.06 0.51 1.15 -0.67
1 1 87 9 280 0.07 -0.46 1.28 0.77
1 1 88 7 604 0.08 1.54 1.77 0.93
1 1 89 8 501 0.07 -0.12 -1.00 -0.15
1 1 90 10 104 0.11 -2.70 -1.09 0.39
1 1 91 9 349 0.06 -1.59 -2.49 -0.30
1 1 92 16 598 0.08 1.35 0.78 -0.89
1 1 93 6 68 0.06 -2.64 0.27 -0.75
1 1 94 10 432 0.09 -1.01 -1.86 0.99
1 1 95 8 842 0.07 2.81 -0.11 -0.57
1 1 96 21 588 0.1 1.16 0.41 -0.63
1 1 97 12 24 0.08 -1.85 2.23 -0.43
1 1 98 5 660 0.04 1.72 2.45 0.12
1 1 99 15 498 0.08 0.99 1.58 -0.99
1 1 100 16 184 0.12 -2.14 -1.59 -0.73
1 1 101 8 38 0.09 -1.97 1.87 -0.69
1 1 102 12 715 0.08 1.96 0.87 0.99
1 1 103 11 816 0.11 2.76 0.47 -0.59
1 1 104 9 236 0.05 -1.38 0.15 -0.79
1 1 105 10 619 0.07 0.58 -1.13 -0.68
1 1 106 12 540 0.05 0.21 -0.98 -0.04
1 1 107 8 638 0.09 1.34 0.25 -0.77
1 1 108 11 255 0.06 -1.33 0.01 -0.74
1 1 109 12 314 0.08 -1.40 -0.92 -0.94
1 1 110 12 729 0.07 1.76 -0.13 -0.97
1 1 111 12 593 0.07 0.64 -0.80 0.23
1 1 112 12 880 0.09 2.47 -1.43 -0.50
1 1 113 7 185 0.05 -1.52 0.61 -0.93
1 1 114 10 438 0.04 0.37 1.03 -0.42
1 1 115 7 410 0.06 0.15 1.09 -0.44
1 1 116 14 286 0.04 -1.05 0.24 -0.37
1 1 117 8 316 0.05 -1.03 -0.02 0.25
1 1 118 10 474 0.07 -0.44 -1.18 -0.67
1 1 119 12 489 0.09 1.05 2.09 -0.94
1 1 120 6 674 0.04 1.25 -0.48 -0.75
1 1 121 7 379 0.08 -1.08 -1.14 -0.90
1 1 122 11 425 0.05 -0.61 -0.82 -0.22
1 1 123 6 186 0.06 -1.63 0.19 0.93
1 1 124 10 805 0.07 1.77 -1.39 -0.97
1 1 125 13 882 0.1 2.24 -1.70 -0.57
1 1 126 18 48 0.11 -1.33 2.63 0.32
1 1 127 7 833 0.09 1.49 -1.94 0.89
1 1 128 11 353 0.1 -1.53 -2.34 0.60
1 1 129 14 400 0.1 -1.29 -2.21 0.82
1 1 130 11 324 0.07 -1.49 -1.42 -1.00
1 1 131 3 123 0.03 -1.68 1.23 1.00
1 1 132 5 71 0.07 -2.78 -0.12 -0.62
1 1 133 4 259 0.07 -1.77 -2.24 -0.52
1 1 134 10 443 0.07 0.37 0.95 0.16
1 1 135 12 896 0.11 2.31 -1.83 -0.30
1 1 136 13 58 0.13 -2.83 -0.06 0.55
1 1 137 4 119 0.05 -1.62 1.37 0.99
1 1 138 14 105 0.1 -2.72 -1.25 0.01
1 1 139 13 634 0.07 0.35 -1.44 0.85
1 1 140 9 97 0.06 -1.95 1.17 0.96
1 1 141 18 326 0.1 -0.07 1.64 0.93
1 1 142 11 559 0.07 0.98 0.33 -0.25
1 1 143 11 362 0.08 -1.25 -1.42 0.99
1 1 144 8 275 0.07 -0.32 1.52 0.89
1 1 145 7 93 0.07 -2.34 0.43 -0.92
1 1 146 9 117 0.04 -1.98 0.68 0.99
1 1 147 14 152 0.09 -1.70 0.76 -0.99
1 1 148 19 590 0.05 0.61 -0.83 -0.25
1 1 149 8 151 0.08 -1.10 1.75 1.00
1 1 150 16 743 0.11 2.28 1.07 -0.85
1 1 151 12 78 0.09 -2.39 0.60 0.88
1 1 152 14 694 0.08 0.69 -1.58 -0.96
1 1 153 19 305 0.07 -0.66 0.87 -0.42
1 1 154 11 684 0.06 1.72 0.76 0.99
1 1 155 6 148 0.04 -0.39 2.73 -0.65
1 1 156 12 785 0.09 2.31 0.07 -0.95
1 1 157 18 595 0.08 0.74 -0.67 -0.01
1 1 158 10 81 0.09 -2.57 0.02 0.82
1 1 159 9 676 0.08 0.38 -1.76 -0.98
1 1 160 9 890 0.11 1.98 -2.10 -0.44
1 1 161 10 490 0.06 -0.19 -0.99 0.13
1 1 162 10 10 0.07 -2.54 1.53 0.24
1 1 163 14 223 0.07 -1.63 -0.41 -0.95
1 1 164 14 307 0.06 -1.13 -0.21 0.52
1 1 165 14 712 0.13 2.02 2.10 -0.38
1 1 166 17 599 0.07 1.07 0.07 0.37
1 1 167 14 741 0.09 1.95 0.17 -0.99
1 1 168 8 641 0.12 1.70 1.96 -0.79
1 1 169 10 253 0.08 -1.25 0.23 -0.69
1 1 170 11 389 0.11 0.70 2.77 0.50
1 1 171 11 134 0.1 -2.34 -0.63 0.90
1 1 172 13 624 0.07 0.85 -0.65 0.37
1 1 173 7 67 0.05 -2.28 1.03 0.86
1 1 174 7 162 0.07 -1.70 0.41 0.96
1 1 175 6 883 0.07 2.82 -0.95 -0.21
1 1 176 10 862 0.08 2.97 -0.10 0.24
1 1 177 12 709 0.09 0.31 -2.22 -0.97
1 1 178 11 562 0.07 1.13 0.68 -0.72
1 1 179 13 461 0.09 -0.98 -2.16 0.92
1 1 180 9 339 0.06 -1.13 -0.54 -0.66
1 1 181 8 799 0.09 1.18 -1.98 0.95
1 1 182 14 386 0.07 -0.86 -0.76 0.52
1 1 183 16 94 0.1 -0.89 2.70 0.52
1 1 184 9 703 0.07 0.25 -2.05 1.00
1 1 185 10 250 0.1 -1.76 -1.41 -0.96
1 1 186 9 150 0.1 -2.30 -1.51 0.65
1 1 187 13 487 0.07 -0.29 -1.10 -0.51
1 1 188 11 832 0.11 1.67 -1.67 0.93
1 1 189 12 302 0.11 -1.63 -1.78 -0.90
1 1 190 8 751 0.04 1.29 -1.39 -0.99
1 1 191 16 128 0.11 -0.87 2.35 0.85
1 1 192 12 683 0.09 -0.37 -2.77 0.60
1 1 193 8 395 0.06 -1.07 -1.41 -0.97
1 1 194 13 265 0.07 -1.46 -0.58 -0.90
1 1 195 4 19 0.07 -2.75 1.07 0.28
1 1 196 14 817 0.08 1.34 -2.08 -0.88
1 1 197 7 6 0.06 -2.20 1.97 -0.31
1 1 198 9 365 0.06 -0.27 0.97 -0.06
1 1 199 15 746 0.07 1.96 0.21 1.00
1 1 200 11 506 0.06 -0.13 -1.04 0.30
1 1 201 11 524 0.07 -0.34 -1.69 -0.96
1 1 202 10 242 0.07 -1.37 -0.04 0.78
1 1 203 9 818 0.06 2.42 -0.19 0.90
1 1 204 8 88 0.09 -1.27 2.21 -0.83
1 1 205 9 854 0.07 2.72 -0.46 0.65
1 1 206 8 5 0.07 -2.44 1.68 -0.25
1 1 207 14 514 0.09 0.87 0.99 0.73
1 1 208 16 607 0.08 1.15 0.10 -0.54
1 1 209 9 734 0.1 -0.11 -2.92 0.38
1 1 210 8 278 0.05 -0.63 1.06 0.65
1 1 211 11 284 0.06 -0.76 0.79 0.44
1 1 212 15 526 0.06 0.03 -1.09 -0.41
1 1 213 14 222 0.08 -0.15 2.27 0.96
1 1 214 9 810 0.12 2.29 -0.41 0.94
1 1 215 12 847 0.09 2.51 -0.88 -0.75
1 1 216 11 73 0.11 -1.20 2.55 -0.56
1 1 217 9 376 0.08 0.18 1.53 0.89
1 1 218 14 672 0.08 0.64 -1.34 0.86
1 1 219 10 387 0.06 -0.04 1.00 -0.06
1 1 220 14 122 0.11 -2.53 -1.40 0.44
1 1 221 17 258 0.1 -1.55 -0.90 0.97
1 1 222 14 177 0.08 -1.76 0.18 -0.97
1 1 223 10 687 0.05 1.39 -0.35 -0.82
1 1 224 12 201 0.06 -1.08 1.18 0.92
1 1 225 10 43 0.11 -1.49 2.47 -0.45
1 1 226 7 366 0.05 -1.02 -0.73 -0.66
1 1 227 7 36 0.09 -2.11 1.72 0.69
1 1 228 5 28 0.08 -2.89 0.61 0.28
1 1 229 16 797 0.12 0.60 -2.79 -0.49
1 1 230 12 602 0.07 0.92 -0.38 -0.01
1 1 231 4 878 0.05 1.96 -1.93 0.66
1 1 232 13 450 0.07 0.56 1.34 -0.84
1 1 233 17 597 0.08 0.55 -0.98 0.47
1 1 234 10 310 0.06 -0.70 0.76 0.24
1 1 235 10 449 0.1 0.79 2.10 0.97
1 1 236 15 431 0.08 0.35 1.09 0.51
1 1 237 12 788 0.1 2.46 0.32 -0.87
1 1 238 12 299 0.05 -0.96 0.27 0.00
1 1 239 10 693 0.06 1.33 -0.61 -0.84
1 1 240 10 512 0.1 -0.74 -2.20 0.94
1 1 241 11 276 0.08 -1.01 0.39 0.39
1 1 242 10 90 0.06 -2.09 0.97 0.95
1 1 243 21 530 0.09 0.84 0.53 0.01
1 1 244 13 589 0.07 1.03 0.07 -0.27
1 1 245 10 240 0.07 -1.44 -0.23 0.84
1 1 246 11 13 0.08 -1.97 2.18 0.36
1 1 247 8 803 0.05 2.46 0.16 0.88
1 1 248 10 293 0.05 -1.10 0.03 0.43
1 1 249 14 504 0.07 0.89 1.23 0.88
1 1 250 5 9 0.07 -2.32 1.82 0.31
1 1 251 14 527 0.07 0.86 0.56 -0.23
1 1 252 22 616 0.07 1.45 1.01 0.97
1 1 253 13 804 0.12 0.49 -2.95 0.07
1 1 254 19 95 0.14 -2.80 -0.93 -0.28
1 1 255 8 126 0.09 -2.49 -1.65 -0.13
1 1 256 10 802 0.11 2.57 0.40 0.79
1 1 257 9 52 0.1 -1.41 2.48 0.51
1 1 258 11 555 0.07 0.28 -1.03 -0.36
1 1 259 14 786 0.1 0.98 -2.25 -0.89
1 1 260 11 325 0.04 -1.00 -0.05 0.04
1 1 261 17 610 0.07 0.87 -0.53 0.19
1 1 262 9 454 0.04 -0.49 -0.94 0.34
1 1 263 7 852 0.08 2.87 -0.06 0.47
1 1 264 13 689 0.09 1.63 0.30 -0.94
1 1 265 10 615 0.08 0.21 -1.59 -0.91
1 1 266 11 298 0.06 -0.98 0.26 -0.17
1 1 267 9 632 0.04 0.81 -0.89 -0.60
1 1 268 11 649 0.05 1.28 0.02 0.69
1 1 269 7 304 0.06 -1.27 -0.52 -0.78
1 1 270 10 221 0.06 -1.40 0.20 0.81
1 1 271 12 295 0.06 -0.92 0.43 0.17
1 1 272 9 37 0.11 -2.88 0.32 0.43
1 1 273 10 700 0.05 1.52 -0.22 -0.89
1 1 274 7 557 0.05 1.43 2.30 -0.70
1 1 275 11 737 0.09 2.15 2.08 0.07
1 1 276 11 153 0.1 -0.67 2.32 -0.90
1 1 277 20 445 0.07 -0.59 -1.03 -0.58
1 1 278 7 850 0.09 2.03 -1.44 0.87
1 1 279 15 568 0.08 1.15 0.81 0.80
1 1 280 16 655 0.11 1.76 1.64 -0.91
1 1 281 9 798 0.08 2.73 0.96 -0.44
1 1 282 8 869 0.05 2.98 -0.34 -0.06
1 1 283 19 143 0.14 -2.39 -1.00 0.80
1 1 284 11 60 0.09 -1.52 2.24 -0.70
1 1 285 11 171 0.1 -0.13 2.85 -0.51
1 1 286 10 391 0.07 -0.87 -0.85 -0.62
1 1 287 5 857 0.08 2.99 0.00 -0.07
1 1 288 8 355 0.07 -0.99 -0.50 -0.45
1 1 289 16 819 0.11 1.89 -1.20 0.97
1 1 290 8 682 0.05 1.43 0.00 0.82
1 1 291 10 633 0.07 0.00 -1.97 -1.00
1 1 292 15 550 0.07 0.98 0.44 -0.37
1 1 293 9 877 0.07 2.47 -1.29 0.62
1 1 294 9 894 0.09 2.52 -1.58 -0.17
1 1 295 8 763 0.08 2.44 1.03 0.76
1 1 296 10 113 0.09 -0.59 2.86 0.37
1 1 297 10 621 0.05 1.03 -0.33 -0.39
1 1 298 9 182 0.1 -2.04 -0.74 -0.98
1 1 299 5 372 0.06 -1.48 -2.61 -0.08
1 1 300 5 658 0.05 1.25 -0.19 0.68
1 1 301 14 231 0.07 0.00 2.39 -0.92
1 1 302 14 133 0.07 -1.71 1.08 -1.00
1 1 303 10 139 0.07 -1.93 0.52 -1.00
1 1 304 7 875 0.06 2.08 -1.73 0.71
1 1 305 5 337 0.05 -1.60 -2.22 -0.68
1 1 306 11 537 0.06 1.07 0.96 -0.83
1 1 307 13 112 0.08 -1.55 1.59 0.97
1 1 308 12 571 0.06 1.22 0.98 0.90
1 1 309 10 782 0.09 2.59 1.50 -0.10
1 1 310 14 47 0.11 -2.93 0.04 -0.34
1 1 311 8 207 0.04 -0.91 1.33 0.92
1 1 312 6 582 0.05 1.39 1.52 1.00
1 1 313 5 264 0.05 -1.31 -0.14 -0.73
1 1 314 14 380 0.08 0.15 1.45 -0.84
1 1 315 13 159 0.09 -0.90 1.90 0.99
1 1 316 8 44 0.07 -2.21 1.51 0.73
1 1 317 11 16 0.09 -2.79 1.04 -0.19
1 1 318 13 580 0.09 0.55 -0.84 0.02
1 1 319 4 3 0.08 -2.06 2.16 -0.14
1 1 320 12 217 0.07 -0.41 1.95 1.00
1 1 321 7 891 0.09 1.99 -2.05 0.50
1 1 322 6 375 0.05 -0.10 1.07 0.37
1 1 323 14 648 0.11 1.72 2.36 -0.36
1 1 324 16 861 0.11 1.70 -2.14 -0.67
1 1 325 10 345 0.05 -0.99 -0.31 0.27
1 1 326 9 453 0.07 -1.15 -2.73 -0.26
1 1 327 10 435 0.06 -0.78 -1.36 -0.90
1 1 328 11 477 0.07 -0.33 -0.99 0.29
1 1 329 10 308 0.07 -0.58 0.96 0.47
1 1 330 7 823 0.05 2.58 -0.05 0.82
1 1 331 11 212 0.07 -1.46 0.29 -0.86
1 1 332 10 531 0.08 1.31 2.69 -0.07
1 1 333 16 168 0.12 -2.24 -1.82 -0.45
1 1 334 7 469 0.06 0.53 0.91 -0.34
1 1 335 12 680 0.06 1.50 0.09 -0.86
1 1 336 15 626 0.08 0.76 -0.84 0.50
1 1 337 13 756 0.09 0.46 -2.55 -0.81
1 1 338 13 408 0.12 0.82 2.87 -0.11
1 1 339 9 241 0.05 -0.20 1.98 -1.00
1 1 340 10 511 0.09 -0.95 -2.83 0.12
1 1 341 17 407 0.08 -0.98 -1.35 0.94
1 1 342 16 320 0.08 -1.40 -1.18 0.98
1 1 343 10 539 0.06 0.18 -1.01 0.23
1 1 344 9 697 0.06 1.11 -1.06 -0.88
1 1 345 22 815 0.11 0.96 -2.55 -0.68
1 1 346 14 343 0.07 -0.30 1.18 -0.62
1 1 347 12 35 0.09 -1.44 2.62 -0.08
1 1 348 12 502 0.08 0.90 1.25 -0.89
1 1 349 12 247 0.07 -1.20 0.33 0.65
1 1 350 7 881 0.06 1.82 -2.13 0.60
1 1 351 17 898 0.12 2.02 -2.20 0.09
1 1 352 4 245 0.06 -0.78 1.17 0.80
1 1 353 8 239 0.08 0.24 2.75 -0.65
1 1 354 9 285 0.09 0.49 2.93 0.19
1 1 355 14 768 0.1 2.51 1.26 -0.57
1 1 356 10 760 0.05 1.45 -1.32 -1.00
1 1 357 6 442 0.07 -1.15 -2.58 -0.57
1 1 358 8 481 0.08 -0.67 -1.79 -0.99
1 1 359 9 65 0.06 -2.42 0.79 -0.83
1 1 360 13 363 0.07 -0.25 1.01 -0.29
1 1 361 6 190 0.05 -1.75 -0.11 -0.97
1 1 362 12 423 0.07 0.58 1.85 1.00
1 1 363 14 622 0.06 0.99 -0.38 0.34
1 1 364 16 809 0.11 0.61 -2.85 0.39
1 1 365 11 691 0.1 -0.39 -2.94 0.22
1 1 366 14 344 0.07 -0.46 0.89 -0.01
1 1 367 10 218 0.07 -1.57 -0.13 -0.90
1 1 368 10 300 0.06 -0.32 1.39 -0.82
1 1 369 14 470 0.09 0.54 0.95 0.41
1 1 370 7 695 0.09 1.91 2.31 -0.08
1 1 371 11 205 0.1 0.18 2.97 0.17
1 1 372 14 367 0.07 -0.88 -0.47 0.06
1 1 373 12 381 0.06 -0.10 1.01 0.14
1 1 374 5 63 0.07 -2.14 1.35 0.85
1 1 375 12 645 0.06 1.65 1.30 0.99
1 1 376 6 628 0.09 1.62 2.52 -0.06
1 1 377 20 164 0.08 -1.97 -0.36 0.99
1 1 378 7 762 0.1 2.46 1.60 -0.35
1 1 379 4 87 0.08 -2.39 0.30 0.91
1 1 380 7 859 0.07 1.87 -1.76 0.82
1 1 381 14 179 0.07 -1.93 -0.57 1.00
1 1 382 14 671 0.07 1.20 -0.43 0.69
1 1 383 9 856 0.12 2.43 -1.16 -0.71
1 1 384 13 471 0.07 -0.44 -1.03 0.47
1 1 385 7 535 0.06 1.00 0.82 -0.71
1 1 386 7 770 0.06 1.99 -0.30 -1.00
1 1 387 14 262 0.07 -1.33 -0.27 0.76
1 1 388 10 72 0.08 -1.72 1.83 -0.86
1 1 389 11 243 0.08 -1.71 -1.32 0.98
1 1 390 10 739 0.08 0.56 -2.06 0.99
1 1 391 12 750 0.07 1.76 -0.53 -0.98
1 1 392 5 281 0.04 -1.28 -0.32 -0.73
1 1 393 9 574 0.08 1.32 1.35 0.99
1 1 394 15 884 0.1 2.84 -0.89 0.17
1 1 395 12 731 0.1 0.76 -1.85 -1.00
1 1 396 5 789 0.06 1.94 -0.92 -0.99
1 1 397 5 486 0.06 0.85 1.54 0.97
1 1 398 10 710 0.08 1.68 0.18 0.95
1 1 399 16 724 0.06 1.59 -0.32 0.93
1 1 400 11 188 0.06 -1.34 0.85 0.91
1 1 401 14 238 0.06 -0.42 1.72 -0.97
1 1 402 9 109 0.07 -2.01 0.87 -0.98
1 1 403 9 603 0.07 0.78 -0.66 -0.24
1 1 404 7 409 0.06 -1.32 -2.61 -0.37
1 1 405 6 675 0.03 0.24 -1.80 0.98
1 1 406 8 279 0.05 -1.01 0.38 -0.39
1 1 407 16 132 0.11 -0.91 2.27 -0.89
1 1 408 18 586 0.08 1.08 0.33 0.50
1 1 409 13 22 0.1 -2.33 1.64 -0.52
1 1 410 8 33 0.1 -2.46 1.30 -0.62
1 1 411 12 204 0.12 -1.96 -2.00 0.59
1 1 412 8 261 0.07 -0.85 0.90 0.64
1 1 413 11 215 0.06 -1.91 -1.14 -0.97
1 1 414 14 776 0.09 0.80 -2.17 0.94
1 1 415 6 456 0.06 0.45 0.90 -0.10
1 1 416 9 338 0.07 -0.16 1.39 0.80
1 1 417 14 306 0.06 -1.01 0.12 0.17
1 1 418 8 14 0.08 -2.14 1.95 0.43
1 1 419 10 814 0.08 0.95 -2.40 0.81
1 1 420 6 406 0.03 0.10 1.02 -0.22
1 1 421 9 790 0.07 1.36 -1.80 -0.97
1 1 422 9 835 0.08 2.84 0.34 0.50
1 1 423 9 426 0.1 0.38 1.21 0.68
1 1 424 14 858 0.08 2.93 -0.20 -0.35
1 1 425 16 234 0.1 -1.09 0.81 -0.77
1 1 426 10 382 0.09 -1.41 -2.46 -0.54
1 1 427 13 417 0.08 0.44 1.57 -0.93
1 1 428 13 496 0.08 0.71 0.74 0.21
1 1 429 11 608 0.06 1.09 -0.06 -0.41
1 1 430 15 727 0.08 1.41 -0.79 0.92
1 1 431 10 392 0.05 0.49 2.07 0.99
1 1 432 4 140 0.04 -1.80 0.70 1.00
1 1 433 10 843 0.11 2.89 0.06 -0.44
1 1 434 5 651 0.04 1.09 -0.56 -0.63
1 1 435 11 841 0.1 2.93 0.40 0.27
1 1 436 10 412 0.06 -0.89 -1.21 -0.86
1 1 437 10 705 0.08 1.98 1.87 0.68
1 1 438 8 554 0.08 -0.64 -2.66 -0.67
1 1 439 10 732 0.07 1.34 -1.09 -0.96
1 1 440 9 673 0.07 0.23 -1.94 -1.00
1 1 441 21 194 0.09 -1.32 0.90 -0.91
1 1 442 10 800 0.09 2.70 0.68 -0.62
1 1 443 9 653 0.06 1.45 0.46 0.88
1 1 444 9 639 0.05 0.94 -0.71 -0.57
1 1 445 13 558 0.06 1.22 1.08 -0.93
1 1 446 11 515 0.1 -0.87 -2.79 -0.37
1 1 447 13 229 0.08 -1.70 -0.72 -0.99
1 1 448 7 666 0.05 1.13 -0.69 -0.73
1 1 449 11 374 0.06 0.02 1.27 0.69
1 1 450 12 821 0.1 2.53 -0.18 -0.84
1 1 451 12 401 0.05 0.11 1.07 0.39
1 1 452 11 208 0.07 -0.69 1.64 -0.97
1 1 453 9 538 0.08 0.08 -1.12 0.48
1 1 454 12 605 0.07 1.48 1.13 -0.99
1 1 455 12 772 0.08 1.74 -0.80 0.99
1 1 456 4 439 0.08 -1.22 -2.74 0.03
1 1 457 8 27 0.06 -2.74 0.93 0.44
1 1 458 11 415 0.06 0.21 1.15 -0.56
1 1 459 13 866 0.09 2.83 -0.62 -0.43
1 1 460 11 183 0.06 -1.62 0.37 -0.94
1 1 461 16 178 0.08 -1.96 -0.42 -1.00
1 1 462 17 333 0.07 -0.39 1.06 0.49
1 1 463 11 522 0.09 1.03 1.38 0.96
1 1 464 6 46 0.06 -2.64 0.80 -0.65
1 1 465 11 96 0.07 -1.81 1.37 -0.96
1 1 466 15 647 0.09 1.59 0.80 -0.97
1 1 467 12 576 0.08 1.43 2.09 0.84
1 1 468 11 795 0.09 2.73 1.21 -0.15
1 1 469 8 354 0.06 0.24 1.95 1.00
1 1 470 9 175 0.09 -0.03 2.97 -0.20
1 1 471 8 840 0.06 1.86 -1.67 -0.86
1 1 472 12 561 0.06 0.01 -1.50 -0.86
1 1 473 11 518 0.04 0.00 -1.00 -0.01
1 1 474 11 106 0.07 -2.52 -0.48 0.82
1 1 475 9 771 0.08 2.49 1.22 0.64
1 1 476 7 678 0.04 1.08 -0.86 -0.79
1 1 477 10 301 0.07 -0.79 0.63 -0.13
1 1 478 9 155 0.08 -2.18 -0.82 0.94
1 1 479 9 98 0.1 -2.61 -0.32 -0.78
1 1 480 7 495 0.05 0.78 0.91 -0.60
1 1 481 6 206 0.06 -0.22 2.29 -0.95
1 1 482 9 413 0.08 0.39 1.51 0.90
1 1 483 17 357 0.08 -1.04 -0.70 0.66
1 1 484 8 85 0.08 -2.72 -0.35 -0.66
1 1 485 7 493 0.07 1.06 2.21 0.89
1 1 486 16 761 0.09 1.57 -1.00 0.99
1 1 487 10 4 0.07 -2.12 2.10 0.15
1 1 488 8 402 0.09 -1.17 -1.92 -0.96
1 1 489 8 706 0.05 1.59 0.02 0.91
1 1 490 11 360 0.07 -1.11 -0.90 -0.82
1 1 491 9 361 0.07 -1.44 -2.08 -0.85
1 1 492 6 384 0.05 -0.86 -0.69 -0.44
1 1 493 12 475 0.07 0.56 0.84 0.14
1 1 494 12 774 0.11 0.20 -2.95 0.24
1 1 495 9 39 0.09 -1.94 1.92 0.68
1 1 496 9 147 0.04 -1.51 1.16 0.99
1 1 497 13 211 0.08 -1.36 0.46 0.82
1 1 498 9 642 0.06 0.82 -0.99 -0.70
1 1 499 12 646 0.06 1.18 -0.33 -0.63
1 1 500 11 416 0.07 0.29 1.25 -0.70

Now let us understand what each column in the above table means:

All the columns after Quant.Error contain the centroid coordinates for each cell. Collectively, these centroids are called a codebook: the set of all centroids, or codewords.
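
To make the codebook idea concrete, here is a minimal sketch of scoring against it: a new observation is assigned to the cell whose codeword is nearest under the L1_Norm distance used in this map. The first three centroids are taken from the table above; the new point is made up for illustration.

```r
# Codebook = matrix of centroids; scoring = nearest-codeword lookup (L1_Norm).
codebook <- rbind(c( 2.19, 0.34,  0.97),   # Cell.ID 767 from the table above
                  c(-2.13, 1.48, -0.80),   # Cell.ID 54
                  c( 1.24, 2.40,  0.71))   # Cell.ID 528
new_point <- c(2.0, 0.5, 0.9)              # hypothetical observation
d <- apply(codebook, 1, function(cw) sum(abs(cw - new_point)))
which.min(d)                               # -> 1: assigned to the first cell
```
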

Now let’s try to understand plotHVT function. The parameters have been explained in detail below:

plotHVT(hvt.results, line.width, color.vec, pch1 = 21, palette.color = 6, centroid.size = 1.5, title = NULL, maxDepth = NULL, dataset, child.level, hmap.cols, previous_level_heatmap = TRUE, show.points = FALSE, asp = 1, ask = TRUE, tess.label = NULL, quant.error.hmap = NULL, n_cells.hmap = NULL, label.size = 0.5, sepration_width = 7, layer_opacity = c(0.5, 0.75, 0.99), dim_size = 1000, heatmap = '2Dhvt') 

Let’s plot the Voronoi tessellation for layer 1 (map A).

plotHVT(torus_mapA,
        line.width = c(0.4), 
        color.vec = c("#141B41"),
        centroid.size = 0.01,
        maxDepth = 1, heatmap = '2Dhvt') 

Figure 2: The Voronoi Tessellation for layer 1 (map A) shown for the 900 cells in the dataset ’torus’

Heat Maps

We will now overlay each feature as a heatmap on the Voronoi tessellation plot, which makes patterns, trends, and variations in the data easier to identify and interpret.

The heatmaps displayed below provide a visual representation of the spatial characteristics of the torus data, showing how each of the features (x, y, z) is distributed. Green shades highlight regions with higher values, while indigo shades mark the regions with the lowest values. By analyzing these heatmaps, we can gain insight into the variations and relationships among these features within the torus data.

  plotHVT(
  torus_mapA,
  dataset_updated_train,
  child.level = 1,
  hmap.cols = "x",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 900,
  heatmap = '2Dheatmap'
) 

Figure 4: The Voronoi Tessellation with the heat map overlaid for variable ’x’ in the ’torus’ dataset

  plotHVT(
  torus_mapA,
  dataset_updated_train,
  child.level = 1,
  hmap.cols = "y",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 900,
  heatmap = '2Dheatmap'
) 

Figure 5: The Voronoi Tessellation with the heat map overlaid for variable ’y’ in the ’torus’ dataset

plotHVT(
  torus_mapA,
  dataset_updated_train,
  child.level = 1,
  hmap.cols = "z",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 900,
  heatmap = '2Dheatmap'
) 

Figure 6: The Voronoi Tessellation with the heat map overlaid for variable ’z’ in the ’torus’ dataset

5. Map B : Compressed Novelty Map

Let us try to visualize the Map B from the flow diagram below.

Figure 10: Data Segregation with highlighted bounding box in red around map B


In this section, we will manually identify the novelty cells from the plotted torus_mapA and store them in the identified_Novelty_cells variable.

Note: To manually select the novelty cells from map A, one can enhance its interactivity by adding plotly elements to the code. This transforms map A into an interactive plot: hovering over a cell’s centroid displays a tag containing segment child information, allowing users to explore the map and selectively choose the novelty cells they wish to consider. An image is added below for reference.

Figure 11: Manually selecting novelty cells


The removeNovelty function removes the identified novelty cell(s) from the training dataset (containing 9600 datapoints) and stores those records separately.

It takes as input the cell numbers (Segment.Child) of the manually identified novelty cell(s) and the compressed HVT map (torus_mapA) with 900 cells. It returns a list of two items: the data with novelty, and the subset of the data without novelty.
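
A minimal base-R sketch of this split (assumed column names and toy data; the real removeNovelty also carries the Cell.ID bookkeeping shown later):

```r
# Toy sketch of what removeNovelty does: split rows by chosen cell numbers.
scored <- data.frame(Cell.Number = c(1, 2, 3, 2, 4),
                     x = rnorm(5), y = rnorm(5), z = rnorm(5))
novelty_cells   <- c(2, 4)                                  # hypothetical cells
with_novelty    <- scored[scored$Cell.Number %in% novelty_cells, ]
without_novelty <- scored[!(scored$Cell.Number %in% novelty_cells), ]
nrow(with_novelty)     # 3 rows flagged as novelty
nrow(without_novelty)  # 2 rows retained
```
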

NOTE: Since we are using torus data here, the identified novelty cells below are given for demonstration purposes.

identified_Novelty_cells <- c(347,504,855,887,138,649,522,853)   # as an example
output_list <- removeNovelty(identified_Novelty_cells, torus_mapA)
data_with_novelty <- output_list[[1]]
data_without_novelty <- output_list[[2]]

Let’s have a look at the data with novelty (containing 104 records). For the sake of brevity, we will only show the first 20 rows.

novelty_data <- data_with_novelty
novelty_data$Row.No <- row.names(novelty_data)
novelty_data <- novelty_data %>% dplyr::select("Row.No","Cell.ID","Cell.Number","x","y","z")
colnames(novelty_data) <- c("Row.No","Cell.ID","Segment.Child","x","y","z")
novelty_data %>% head(100) %>% 
  as.data.frame() %>%
  Table(scroll = TRUE, limit = 20)
Row.No Cell.ID Segment.Child x y z
1 105 138 -2.6568 -1.3920 0.0368
2 105 138 -2.7651 -1.1357 -0.1463
3 105 138 -2.7382 -1.2032 0.1345
4 105 138 -2.7586 -1.1768 -0.0423
5 105 138 -2.7917 -1.0899 -0.0787
6 105 138 -2.6816 -1.3334 -0.1021
7 105 138 -2.7255 -1.2424 -0.0963
8 105 138 -2.7984 -1.0772 0.0543
9 105 138 -2.6813 -1.3455 0.0141
10 105 138 -2.6985 -1.3011 0.0917
11 105 138 -2.6954 -1.3159 0.0331
12 105 138 -2.7876 -1.1062 0.0432
13 105 138 -2.6655 -1.3766 0.0088
14 105 138 -2.6666 -1.3543 0.1355
15 35 347 -1.5346 2.5778 -0.0070
16 35 347 -1.4572 2.6197 0.0670
17 35 347 -1.3234 2.6788 -0.1553
18 35 347 -1.4328 2.6352 0.0303
19 35 347 -1.5143 2.5889 -0.0375
20 35 347 -1.4989 2.5819 -0.1701

5.1 Voronoi Tessellation with highlighted novelty cell

The plotNovelCells function plots the Voronoi tessellation using the compressed HVT map (torus_mapA) containing 900 cells and highlights the identified novelty cells, i.e., the 8 cells (containing 104 records), in red on the map.

plotNovelCells(identified_Novelty_cells, torus_mapA,line.width = c(0.4),centroid.size = 0.01)

Figure 12: The Voronoi Tessellation constructed using the compressed HVT map (map A) with the novelty cell(s) highlighted in red

We pass the dataframe with the novelty records (104 records) to the trainHVT function, along with the other model parameters mentioned below, to generate map B (layer 2).

Model Parameters

colnames(data_with_novelty) <- c("Cell.ID","Segment.Child","x","y","z")
data_with_novelty <- data_with_novelty[,-1:-2]
torus_mapB <- list()
mapA_scale_summary = torus_mapA[[3]]$scale_summary
torus_mapB <- trainHVT(data_with_novelty,
                  n_cells = 17,   
                  depth = 1,
                  quant.err = 0.1,
                  projection.scale = 10,
                  normalize = FALSE,
                  distance_metric = "L1_Norm",
                  error_metric = "max",
                  quant_method = "kmeans"
                  )

The datatable displayed below is the summary from map B (layer 2) showing Cell.ID, Centroids and Quantization Error for each of the 17 cells.

summaryTable(torus_mapB[[3]]$summary,scroll = TRUE)
Segment.Level Segment.Parent Segment.Child n Cell.ID Quant.Error x y z
1 1 1 5 5 0.04 -0.55 2.95 -0.01
1 1 2 11 9 0.07 2.82 1.01 0.08
1 1 3 6 4 0.11 -0.69 2.91 0.07
1 1 4 10 6 0.11 -1.04 2.79 0.17
1 1 5 6 7 0.06 -1.39 2.64 -0.17
1 1 6 7 17 0.09 1.42 -2.63 -0.10
1 1 7 12 15 0.09 0.87 -2.87 0.01
1 1 8 1 2 0 -0.16 2.99 0.12
1 1 9 5 12 0.07 -2.84 -0.89 0.21
1 1 10 6 8 0.04 -1.48 2.61 0.01
1 1 11 5 1 0.05 -0.42 2.95 0.20
1 1 12 8 14 0.09 -2.69 -1.33 0.04
1 1 13 3 3 0.02 -0.29 2.98 0.04
1 1 14 4 11 0.05 -2.89 -0.82 0.02
1 1 15 5 16 0.04 1.21 -2.74 -0.10
1 1 16 4 10 0.1 -2.87 -0.72 0.25
1 1 17 6 13 0.07 -2.77 -1.14 -0.04

Now let’s check the compression summary for HVT (torus_mapB). The table below shows the number of cells, the number of cells with quantization error below the threshold, and the percentage of cells below the threshold at each level.

mapB_compression_summary <- torus_mapB[[3]]$compression_summary %>% dplyr::mutate_if(is.numeric, ~ round(., 4))
compressionSummaryTable(mapB_compression_summary)
segmentLevel noOfCells noOfCellsBelowQuantizationError percentOfCellsBelowQuantizationErrorThreshold parameters
1 17 15 0.88 n_cells: 17 quant.err: 0.1 distance_metric: L1_Norm error_metric: max quant_method: kmeans

As can be seen from the table above, 88% of the cells are below the quantization error threshold. Since we have attained the desired compression percentage, we will not subdivide the cells further.
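
The percentage column in the compression summary is simply the ratio of cells below the threshold to the total number of cells; for map B above:

```r
# Reproduce percentOfCellsBelowQuantizationErrorThreshold for map B by hand.
n_cells_total <- 17   # noOfCells from the summary table
n_below       <- 15   # noOfCellsBelowQuantizationError
round(n_below / n_cells_total, 2)   # -> 0.88, matching the table
```
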

6. Map C : Compressed Map without Novelty

Let us try to visualize the compressed Map C from the flow diagram below.

Figure 13: Data Segregation with highlighted bounding box in red around compressed map C


6.1 Iteration 1:

With the novelties removed, we construct another hierarchical Voronoi tessellation, map C (layer 2), on the data without novelty (containing 9496 records) using the model parameters mentioned below.

Model Parameters

torus_mapC <- list()
mapA_scale_summary = torus_mapA[[3]]$scale_summary
torus_mapC <- trainHVT(data_without_novelty,
                  n_cells = 10,
                  depth = 2,
                  quant.err = 0.1,
                  projection.scale = 10,
                  normalize = FALSE,
                  distance_metric = "L1_Norm",
                  error_metric = "max",
                  quant_method = "kmeans",
                  diagnose = FALSE,
                  scale_summary = mapA_scale_summary)

Now let’s check the compression summary for HVT (torus_mapC), where n_cells was set to 10. The table below shows the number of cells, the number of cells with quantization error below the threshold, and the percentage of cells below the threshold at each level.

mapC_compression_summary <- torus_mapC[[3]]$compression_summary %>% dplyr::mutate_if(is.numeric, ~ round(., 4))
compressionSummaryTable(mapC_compression_summary)
segmentLevel noOfCells noOfCellsBelowQuantizationError percentOfCellsBelowQuantizationErrorThreshold parameters
1 10 0 0 n_cells: 10 quant.err: 0.1 distance_metric: L1_Norm error_metric: max quant_method: kmeans
2 100 0 0 n_cells: 10 quant.err: 0.1 distance_metric: L1_Norm error_metric: max quant_method: kmeans

As can be seen from the table above, none of the cells hit the quantization error threshold at level 1 or level 2.
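
Conceptually, depth = 2 quantizes each level-1 cell again with the same n_cells. A toy sketch with nested stats::kmeans (illustrative only; trainHVT handles this internally):

```r
# Toy sketch of depth-2 hierarchical VQ: cluster, then re-cluster each cell.
set.seed(240)
toy    <- matrix(rnorm(1500), ncol = 3)
level1 <- stats::kmeans(toy, centers = 5, nstart = 3)
level2 <- lapply(seq_len(5), function(k) {
  pts <- toy[level1$cluster == k, , drop = FALSE]
  stats::kmeans(pts, centers = min(5, nrow(pts)), nstart = 3)$centers
})
nrow(do.call(rbind, level2))   # up to 5 * 5 = 25 level-2 centroids
```
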

6.2 Iteration 2:

Since we have yet to achieve at least 80% compression at depth 2, let’s compress again using the model parameters below and the data without novelty (containing 9496 records).

Model Parameters

torus_mapC <- list()
torus_mapC <- trainHVT(data_without_novelty,
                  n_cells = 30,    
                  depth = 2,
                  quant.err = 0.1,
                  projection.scale = 10,
                  normalize = FALSE,
                  distance_metric = "L1_Norm",
                  error_metric = "max",
                  quant_method = "kmeans",
                  diagnose = FALSE,
                  scale_summary = mapA_scale_summary)

The datatable displayed below is the summary from map C (layer 2), showing Cell.ID, centroids, and quantization error for each of the 925 cells.

summaryTable(torus_mapC[[3]]$summary,scroll = TRUE,limit = 500)
Segment.Level Segment.Parent Segment.Child n Cell.ID Quant.Error x y z
1 1 1 368 595 0.55 0.23 -1.20 -0.60
1 1 2 319 136 0.6 -0.39 2.37 -0.76
1 1 3 391 284 0.54 0.05 1.20 0.55
1 1 4 276 175 0.54 2.79 -0.07 0.27
1 1 5 392 600 0.48 -0.77 -0.69 0.20
1 1 6 253 489 0.53 -2.36 1.01 0.62
1 1 7 249 749 0.54 0.77 -2.32 -0.78
1 1 8 272 796 0.58 -1.24 -1.69 0.89
1 1 9 238 885 0.57 -2.22 -1.70 -0.25
1 1 10 423 486 0.52 -1.11 0.40 0.52
1 1 11 315 238 0.65 -1.24 2.00 0.78
1 1 12 350 369 0.51 -0.69 1.03 -0.58
1 1 13 408 401 0.47 1.05 -0.21 -0.27
1 1 14 361 561 0.52 -1.46 0.00 -0.78
1 1 15 286 86 0.62 0.40 2.46 0.70
1 1 16 285 440 0.47 1.61 -0.74 0.89
1 1 17 268 739 0.56 -2.20 -0.53 0.84
1 1 18 269 43 0.61 2.02 1.79 0.48
1 1 19 293 349 0.58 -1.95 1.61 -0.67
1 1 20 293 781 0.6 0.36 -2.39 0.76
1 1 21 415 241 0.52 1.37 0.67 0.79
1 1 22 249 881 0.58 -0.79 -2.68 -0.23
1 1 23 261 586 0.59 2.05 -1.89 0.32
1 1 24 452 248 0.51 0.71 1.01 -0.57
1 1 25 254 55 0.54 1.24 2.24 -0.65
1 1 26 320 449 0.52 1.95 -1.11 -0.83
1 1 27 325 143 0.62 2.20 0.64 -0.82
1 1 28 316 734 0.53 -1.13 -1.33 -0.90
1 1 29 238 735 0.56 -2.67 -0.11 -0.55
1 1 30 357 584 0.58 0.40 -1.19 0.62
2 1 1 21 610 0.09 0.07 -1.16 -0.55
2 1 2 17 512 0.06 0.60 -0.85 -0.27
2 1 3 19 546 0.09 0.46 -1.01 -0.44
2 1 4 12 506 0.06 0.79 -0.91 -0.60
2 1 5 6 746 0.06 -0.30 -1.78 -0.98
2 1 6 14 542 0.08 0.60 -1.06 -0.63
2 1 7 15 678 0.06 0.00 -1.50 -0.86
2 1 8 17 576 0.09 0.20 -1.04 -0.34
2 1 9 17 591 0.09 0.36 -1.23 -0.69
2 1 10 12 645 0.06 0.01 -1.31 -0.72
2 1 11 7 499 0.09 1.00 -0.99 -0.80
2 1 12 16 604 0.09 -0.14 -1.05 -0.34
2 1 13 8 670 0.04 -0.18 -1.37 -0.79
2 1 14 9 504 0.05 0.71 -0.85 -0.45
2 1 15 10 566 0.07 0.65 -1.22 -0.79
2 1 16 4 673 0.05 0.47 -1.65 -0.96
2 1 17 17 570 0.07 0.16 -0.99 -0.07
2 1 18 16 631 0.07 -0.18 -1.17 -0.58
2 1 19 11 671 0.08 -0.40 -1.27 -0.74
2 1 20 10 526 0.06 0.84 -1.03 -0.74
2 1 21 16 627 0.07 -0.39 -1.05 -0.47
2 1 22 10 640 0.07 0.30 -1.43 -0.84
2 1 23 11 701 0.07 0.24 -1.74 -0.97
2 1 24 10 562 0.08 0.87 -1.32 -0.91
2 1 25 12 709 0.09 -0.19 -1.59 -0.92
2 1 26 13 616 0.07 0.69 -1.48 -0.93
2 1 27 18 589 0.09 -0.09 -1.00 -0.10
2 1 28 6 517 0.05 0.55 -0.84 -0.12
2 1 29 14 733 0.08 -0.05 -1.82 -0.98
2 1 30 0 NA NA NA NA NA
2 2 1 7 103 0.07 -0.76 2.82 -0.38
2 2 2 12 61 0.11 -0.14 2.90 -0.41
2 2 3 6 153 0.04 0.03 1.96 -1.00
2 2 4 12 82 0.08 -0.27 2.72 -0.68
2 2 5 13 156 0.1 -0.63 2.30 -0.92
2 2 6 10 112 0.09 -0.60 2.65 -0.70
2 2 7 16 179 0.11 -0.92 2.25 -0.90
2 2 8 8 79 0.16 -0.53 2.92 -0.13
2 2 9 13 162 0.08 -0.24 2.06 -0.99
2 2 10 16 227 0.1 -0.96 1.86 -0.99
2 2 11 15 45 0.12 0.18 2.95 -0.28
2 2 12 20 88 0.09 0.01 2.53 -0.84
2 2 13 8 128 0.07 -1.00 2.66 -0.53
2 2 14 12 89 0.1 -0.49 2.80 -0.52
2 2 15 15 125 0.09 0.02 2.21 -0.97
2 2 16 4 108 0.05 0.39 2.18 -0.98
2 2 17 7 139 0.04 0.17 1.99 -1.00
2 2 18 17 118 0.11 -0.34 2.43 -0.88
2 2 19 7 71 0.07 0.39 2.53 -0.83
2 2 20 10 137 0.09 -0.86 2.53 -0.73
2 2 21 12 204 0.09 -0.24 1.70 -0.96
2 2 22 4 151 0.03 0.32 1.83 -0.99
2 2 23 12 60 0.1 0.17 2.74 -0.66
2 2 24 4 132 0.04 0.32 1.98 -1.00
2 2 25 14 215 0.09 -0.55 1.75 -0.98
2 2 26 9 121 0.07 -1.06 2.76 -0.27
2 2 27 8 198 0.11 -1.19 2.19 -0.86
2 2 28 7 184 0.06 0.06 1.72 -0.96
2 2 29 11 177 0.06 -0.47 2.04 -0.99
2 2 30 10 159 0.12 -1.27 2.53 -0.53
2 3 1 12 332 0.05 -0.05 1.02 0.19
2 3 2 16 327 0.06 0.02 1.00 -0.02
2 3 3 22 190 0.12 0.70 1.46 0.92
2 3 4 9 265 0.07 -0.15 1.37 0.78
2 3 5 17 366 0.1 -0.48 0.99 0.43
2 3 6 11 254 0.08 0.30 1.25 0.70
2 3 7 8 288 0.07 0.63 0.86 0.36
2 3 8 18 310 0.07 0.25 0.97 0.10
2 3 9 15 289 0.07 -0.04 1.20 0.60
2 3 10 11 270 0.08 0.63 0.99 0.56
2 3 11 7 202 0.06 -0.30 1.80 0.98
2 3 12 11 201 0.09 -0.03 1.71 0.96
2 3 13 11 336 0.07 -0.32 1.10 0.52
2 3 14 7 343 0.06 -0.70 1.23 0.81
2 3 15 14 224 0.07 -0.12 1.56 0.90
2 3 16 15 249 0.09 0.12 1.34 0.75
2 3 17 10 293 0.07 0.40 0.95 0.25
2 3 18 16 363 0.08 -0.39 0.94 0.21
2 3 19 18 311 0.07 0.09 1.05 0.33
2 3 20 12 277 0.08 0.40 1.07 0.51
2 3 21 12 307 0.08 -0.25 1.22 0.66
2 3 22 15 348 0.06 -0.23 0.98 0.11
2 3 23 9 367 0.05 -0.63 1.05 0.64
2 3 24 16 283 0.07 0.18 1.13 0.52
2 3 25 7 295 0.05 0.57 0.85 0.20
2 3 26 21 200 0.08 0.35 1.57 0.91
2 3 27 14 259 0.08 -0.46 1.53 0.91
2 3 28 9 318 0.07 -0.49 1.29 0.78
2 3 29 22 242 0.09 0.58 1.18 0.72
2 3 30 6 325 0.05 -0.10 1.07 0.37
2 4 1 4 154 0.06 2.84 0.02 0.54
2 4 2 6 333 0.1 2.84 -0.95 0.04
2 4 3 12 221 0.1 2.90 -0.57 0.26
2 4 4 9 150 0.1 2.52 0.33 0.84
2 4 5 9 191 0.14 2.40 0.10 0.91
2 4 6 7 75 0.07 2.92 0.64 0.13
2 4 7 11 243 0.1 2.84 -0.64 -0.40
2 4 8 9 192 0.1 2.88 -0.33 0.43
2 4 9 6 106 0.04 2.93 0.35 0.31
2 4 10 9 126 0.08 2.82 0.29 0.55
2 4 11 12 160 0.09 2.96 -0.11 0.25
2 4 12 5 257 0.09 2.89 -0.73 -0.18
2 4 13 9 87 0.1 2.68 0.74 0.62
2 4 14 9 123 0.1 2.56 0.55 0.78
2 4 15 15 165 0.12 2.86 -0.07 -0.50
2 4 16 12 185 0.07 2.97 -0.36 -0.01
2 4 17 5 220 0.1 2.78 -0.48 -0.56
2 4 18 10 329 0.12 2.72 -0.81 0.53
2 4 19 12 173 0.08 2.93 -0.22 -0.33
2 4 20 6 129 0.06 2.94 0.12 0.34
2 4 21 8 169 0.06 2.66 0.10 0.75
2 4 22 12 76 0.07 2.92 0.61 -0.13
2 4 23 11 299 0.09 2.84 -0.85 0.24
2 4 24 11 69 0.07 2.80 0.84 0.38
2 4 25 6 134 0.08 2.99 0.00 -0.11
2 4 26 8 109 0.11 2.93 0.29 -0.30
2 4 27 4 100 0.04 2.96 0.38 0.18
2 4 28 19 219 0.12 2.50 -0.20 0.86
2 4 29 16 252 0.14 2.63 -0.48 0.73
2 4 30 4 99 0.04 2.88 0.47 0.39
2 5 1 3 675 0.02 -0.51 -1.23 0.74
2 5 2 15 593 0.08 -0.85 -0.61 -0.31
2 5 3 10 585 0.09 -0.94 -0.48 0.34
2 5 4 15 540 0.06 -0.99 -0.15 -0.05
2 5 5 11 628 0.05 -0.42 -1.02 0.45
2 5 6 14 580 0.07 -0.88 -0.48 -0.12
2 5 7 7 582 0.06 -0.98 -0.45 -0.39
2 5 8 7 658 0.04 -0.47 -1.14 0.64
2 5 9 14 620 0.08 -1.21 -0.47 0.71
2 5 10 11 619 0.07 -0.51 -0.93 -0.34
2 5 11 19 605 0.08 -0.39 -0.92 -0.06
2 5 12 18 629 0.08 -0.90 -0.75 0.55
2 5 13 15 592 0.07 -0.74 -0.67 -0.03
2 5 14 14 623 0.08 -0.70 -0.86 0.44
2 5 15 7 653 0.06 -0.59 -1.06 0.61
2 5 16 21 607 0.06 -0.62 -0.80 -0.15
2 5 17 12 664 0.08 -1.26 -0.67 0.82
2 5 18 18 569 0.08 -1.04 -0.29 0.40
2 5 19 14 579 0.07 -0.88 -0.47 0.06
2 5 20 13 618 0.1 -0.75 -0.79 -0.41
2 5 21 13 609 0.07 -0.57 -0.83 0.14
2 5 22 12 556 0.07 -0.97 -0.29 -0.16
2 5 23 12 617 0.05 -0.47 -0.94 0.30
2 5 24 16 613 0.08 -1.07 -0.55 0.60
2 5 25 13 612 0.08 -0.29 -0.99 0.26
2 5 26 14 596 0.07 -0.22 -0.98 0.05
2 5 27 21 601 0.09 -0.74 -0.71 0.23
2 5 28 9 667 0.1 -1.11 -0.83 0.79
2 5 29 12 550 0.05 -0.98 -0.25 0.15
2 5 30 12 663 0.08 -0.80 -0.99 0.69
2 6 1 15 571 0.1 -2.16 0.49 0.97
2 6 2 10 525 0.07 -2.20 0.76 0.94
2 6 3 15 430 0.1 -1.65 1.08 1.00
2 6 4 9 578 0.07 -2.38 0.56 0.89
2 6 5 12 412 0.07 -2.58 1.52 0.08
2 6 6 6 692 0.06 -2.89 0.33 0.41
2 6 7 8 372 0.07 -2.11 1.64 0.73
2 6 8 10 415 0.11 -2.25 1.43 0.74
2 6 9 9 475 0.08 -2.25 1.06 0.87
2 6 10 5 554 0.05 -2.57 0.71 0.74
2 6 11 5 326 0.05 -2.26 1.90 0.31
2 6 12 3 344 0.08 -2.24 1.80 0.49
2 6 13 10 662 0.1 -2.73 0.41 0.65
2 6 14 11 507 0.09 -2.79 1.04 -0.19
2 6 15 9 450 0.06 -2.68 1.34 0.00
2 6 16 16 527 0.11 -1.92 0.63 1.00
2 6 17 4 520 0.06 -2.68 0.96 0.53
2 6 18 3 376 0.07 -2.44 1.71 0.19
2 6 19 6 384 0.08 -1.92 1.51 0.90
2 6 20 11 437 0.09 -1.93 1.19 0.96
2 6 21 8 516 0.07 -2.75 0.99 0.37
2 6 22 6 522 0.09 -2.47 0.89 0.78
2 6 23 7 647 0.1 -2.54 0.37 0.82
2 6 24 13 476 0.1 -2.02 0.96 0.97
2 6 25 9 536 0.08 -2.83 0.97 0.00
2 6 26 7 456 0.08 -2.66 1.30 0.28
2 6 27 5 575 0.06 -2.70 0.71 0.61
2 6 28 6 635 0.11 -2.90 0.62 0.23
2 6 29 7 413 0.09 -2.52 1.52 0.33
2 6 30 8 463 0.09 -2.48 1.22 0.63
2 7 1 13 721 0.08 1.02 -2.19 -0.91
2 7 2 8 731 0.07 0.31 -1.94 -1.00
2 7 3 5 835 0.06 0.66 -2.86 -0.34
2 7 4 11 669 0.09 0.99 -1.88 -0.99
2 7 5 5 815 0.04 0.61 -2.71 -0.63
2 7 6 8 818 0.06 0.34 -2.58 -0.80
2 7 7 10 774 0.05 1.06 -2.56 -0.63
2 7 8 11 621 0.13 1.34 -1.80 -0.96
2 7 9 13 639 0.09 0.83 -1.65 -0.99
2 7 10 12 766 0.09 0.94 -2.46 -0.77
2 7 11 10 677 0.11 1.63 -2.12 -0.73
2 7 12 7 762 0.05 0.07 -2.05 -1.00
2 7 13 5 741 0.08 1.55 -2.46 -0.40
2 7 14 5 759 0.05 1.31 -2.54 -0.51
2 7 15 12 775 0.09 0.31 -2.24 -0.96
2 7 16 9 847 0.07 0.14 -2.77 -0.63
2 7 17 5 833 0.06 0.48 -2.78 -0.57
2 7 18 4 851 0.06 0.47 -2.93 -0.25
2 7 19 7 805 0.06 0.96 -2.75 -0.39
2 7 20 10 736 0.09 1.24 -2.38 -0.73
2 7 21 11 760 0.08 0.64 -2.30 -0.92
2 7 22 14 679 0.08 1.34 -2.08 -0.88
2 7 23 7 802 0.08 -0.03 -2.28 -0.96
2 7 24 10 704 0.08 0.69 -1.97 -0.99
2 7 25 5 808 0.05 0.85 -2.73 -0.51
2 7 26 5 854 0.07 0.25 -2.89 -0.42
2 7 27 8 801 0.05 0.51 -2.53 -0.81
2 7 28 12 827 0.08 0.12 -2.54 -0.83
2 7 29 4 797 0.06 0.81 -2.64 -0.65
2 7 30 3 786 0.07 1.20 -2.68 -0.35
2 8 1 9 892 0.07 -1.93 -1.99 0.63
2 8 2 10 876 0.1 -1.12 -2.43 0.74
2 8 3 11 850 0.08 -1.76 -1.72 0.88
2 8 4 2 687 0.02 -0.52 -1.32 0.81
2 8 5 11 859 0.08 -1.30 -2.15 0.85
2 8 6 8 716 0.07 -1.17 -1.11 0.92
2 8 7 11 872 0.1 -1.65 -2.03 0.79
2 8 8 8 825 0.09 -1.38 -1.77 0.96
2 8 9 10 852 0.1 -0.77 -2.37 0.86
2 8 10 9 752 0.07 -0.47 -1.76 0.98
2 8 11 10 765 0.06 -1.46 -1.24 0.99
2 8 12 5 699 0.05 -0.68 -1.31 0.85
2 8 13 2 688 0.01 -0.79 -1.18 0.81
2 8 14 10 694 0.04 -1.22 -0.94 0.89
2 8 15 10 804 0.09 -1.04 -1.83 0.99
2 8 16 11 807 0.07 -1.58 -1.49 0.98
2 8 17 7 869 0.08 -2.06 -1.66 0.76
2 8 18 10 778 0.07 -0.92 -1.70 1.00
2 8 19 10 897 0.1 -1.49 -2.35 0.61
2 8 20 10 722 0.08 -1.41 -0.99 0.96
2 8 21 10 755 0.07 -0.97 -1.52 0.98
2 8 22 8 714 0.05 -0.51 -1.48 0.90
2 8 23 10 767 0.08 -1.27 -1.40 0.99
2 8 24 7 730 0.07 -0.69 -1.49 0.93
2 8 25 9 800 0.09 -1.78 -1.28 0.98
2 8 26 13 727 0.05 -0.99 -1.32 0.94
2 8 27 17 834 0.11 -0.93 -2.12 0.94
2 8 28 7 908 0.1 -1.74 -2.28 0.47
2 8 29 7 790 0.1 -0.52 -1.98 1.00
2 8 30 10 837 0.07 -2.00 -1.42 0.88
2 9 1 9 877 0.09 -1.79 -1.94 -0.76
2 9 2 3 913 0.04 -1.99 -2.12 0.42
2 9 3 11 902 0.08 -2.55 -1.55 0.14
2 9 4 12 893 0.09 -2.57 -1.48 -0.24
2 9 5 9 915 0.11 -2.36 -1.83 0.13
2 9 6 5 924 0.06 -1.78 -2.39 0.17
2 9 7 8 901 0.09 -1.70 -2.23 -0.59
2 9 8 6 879 0.08 -2.29 -1.58 0.62
2 9 9 11 899 0.11 -2.23 -1.80 0.49
2 9 10 9 890 0.07 -1.97 -1.93 -0.65
2 9 11 4 914 0.03 -2.03 -2.09 -0.40
2 9 12 7 878 0.11 -2.63 -1.29 0.36
2 9 13 5 922 0.05 -2.09 -2.12 0.18
2 9 14 16 867 0.11 -2.32 -1.45 -0.66
2 9 15 4 866 0.04 -2.79 -1.05 -0.17
2 9 16 14 829 0.08 -2.29 -1.09 -0.84
2 9 17 6 919 0.06 -1.96 -2.22 0.28
2 9 18 7 918 0.06 -1.93 -2.23 -0.29
2 9 19 6 907 0.11 -2.39 -1.76 -0.24
2 9 20 9 846 0.08 -2.53 -1.09 -0.65
2 9 21 13 830 0.1 -2.09 -1.26 -0.89
2 9 22 9 857 0.09 -1.99 -1.61 -0.83
2 9 23 13 898 0.1 -2.21 -1.79 -0.53
2 9 24 8 887 0.1 -2.49 -1.50 0.41
2 9 25 5 916 0.04 -2.12 -2.05 0.32
2 9 26 5 923 0.05 -1.80 -2.37 -0.23
2 9 27 3 912 0.03 -2.15 -1.98 -0.37
2 9 28 9 921 0.06 -2.20 -2.03 0.05
2 9 29 6 864 0.09 -2.68 -1.13 -0.41
2 9 30 6 925 0.06 -2.05 -2.19 -0.03
2 10 1 7 530 0.03 -1.00 -0.02 0.05
2 10 2 16 538 0.07 -1.26 0.07 0.67
2 10 3 12 501 0.07 -1.37 0.41 0.82
2 10 4 19 608 0.08 -1.37 -0.25 0.79
2 10 5 13 404 0.07 -1.16 1.04 0.89
2 10 6 15 505 0.05 -1.02 0.17 0.28
2 10 7 14 503 0.1 -1.22 0.31 0.67
2 10 8 10 396 0.07 -0.76 0.89 0.56
2 10 9 19 496 0.07 -0.98 0.22 -0.10
2 10 10 15 465 0.07 -0.90 0.45 -0.07
2 10 11 14 434 0.09 -1.06 0.80 0.74
2 10 12 18 524 0.06 -1.12 0.15 0.50
2 10 13 7 572 0.06 -1.28 -0.13 0.70
2 10 14 17 411 0.09 -0.70 0.77 0.29
2 10 15 19 557 0.06 -1.15 -0.15 0.54
2 10 16 10 428 0.08 -0.76 0.66 0.09
2 10 17 14 487 0.08 -1.50 0.62 0.92
2 10 18 17 466 0.07 -1.05 0.52 0.57
2 10 19 20 442 0.11 -1.40 0.92 0.94
2 10 20 9 533 0.05 -1.03 0.00 0.23
2 10 21 11 537 0.04 -1.07 -0.04 0.38
2 10 22 13 483 0.06 -0.95 0.33 0.08
2 10 23 14 429 0.07 -0.87 0.73 0.49
2 10 24 16 563 0.1 -1.46 0.06 0.84
2 10 25 9 509 0.04 -1.00 0.13 0.13
2 10 26 12 541 0.1 -1.67 0.33 0.95
2 10 27 14 468 0.08 -1.20 0.60 0.75
2 10 28 17 392 0.08 -0.89 1.02 0.76
2 10 29 15 484 0.08 -1.01 0.36 0.36
2 10 30 17 451 0.08 -0.88 0.53 0.22
2 11 1 10 217 0.08 -0.69 1.88 1.00
2 11 2 8 296 0.09 -0.83 1.51 0.95
2 11 3 9 161 0.1 -1.29 2.58 0.45
2 11 4 11 250 0.08 -1.45 1.96 0.90
2 11 5 11 194 0.08 -0.97 2.21 0.91
2 11 6 8 163 0.19 -1.45 2.59 0.21
2 11 7 13 167 0.1 -0.94 2.44 0.78
2 11 8 9 302 0.08 -1.83 1.86 0.79
2 11 9 9 199 0.1 -1.37 2.31 0.73
2 11 10 12 131 0.12 -0.96 2.68 0.51
2 11 11 12 193 0.07 -0.46 1.96 1.00
2 11 12 11 207 0.1 -1.59 2.24 0.66
2 11 13 6 233 0.09 -1.81 2.12 0.61
2 11 14 8 244 0.06 -0.61 1.68 0.98
2 11 15 9 305 0.08 -1.55 1.75 0.94
2 11 16 10 113 0.15 -0.71 2.73 0.55
2 11 17 9 351 0.07 -1.79 1.62 0.91
2 11 18 12 230 0.09 -1.17 2.02 0.94
2 11 19 12 335 0.06 -1.14 1.47 0.99
2 11 20 5 138 0.05 -1.25 2.68 0.29
2 11 21 8 266 0.09 -1.29 1.83 0.97
2 11 22 12 228 0.07 -0.91 1.89 0.99
2 11 23 9 304 0.12 -2.06 1.93 0.55
2 11 24 15 381 0.1 -1.33 1.29 0.98
2 11 25 13 360 0.07 -0.99 1.26 0.92
2 11 26 9 278 0.06 -1.07 1.69 1.00
2 11 27 15 358 0.08 -1.57 1.49 0.98
2 11 28 17 223 0.21 -1.97 2.19 0.30
2 11 29 14 180 0.14 -1.57 2.44 0.42
2 11 30 9 147 0.11 -0.57 2.36 0.90
2 12 1 21 356 0.09 -0.34 0.97 -0.24
2 12 2 16 373 0.09 -0.70 1.00 -0.62
2 12 3 10 425 0.06 -0.80 0.68 -0.32
2 12 4 11 400 0.07 -0.64 0.78 -0.13
2 12 5 12 280 0.06 -0.47 1.37 -0.84
2 12 6 12 342 0.09 -0.48 1.08 -0.57
2 12 7 16 317 0.08 -0.31 1.16 -0.60
2 12 8 8 352 0.07 -1.17 1.28 -0.96
2 12 9 11 439 0.06 -1.25 0.80 -0.86
2 12 10 8 239 0.1 -0.05 1.40 -0.80
2 12 11 10 435 0.07 -0.79 0.63 -0.13
2 12 12 7 247 0.07 -0.23 1.43 -0.83
2 12 13 12 390 0.06 -1.04 1.03 -0.84
2 12 14 15 459 0.07 -0.93 0.50 -0.31
2 12 15 7 269 0.04 -0.88 1.60 -0.98
2 12 16 14 378 0.08 -0.45 0.89 -0.02
2 12 17 10 398 0.08 -0.61 0.79 0.06
2 12 18 13 416 0.08 -1.31 0.95 -0.92
2 12 19 16 328 0.09 -0.10 1.04 -0.28
2 12 20 15 255 0.08 -0.59 1.56 -0.94
2 12 21 14 414 0.08 -0.84 0.78 -0.52
2 12 22 14 359 0.07 -0.89 1.15 -0.83
2 12 23 17 391 0.07 -0.65 0.87 -0.41
2 12 24 7 452 0.07 -1.02 0.57 -0.55
2 12 25 11 312 0.05 -0.97 1.43 -0.96
2 12 26 5 286 0.05 -1.11 1.58 -1.00
2 12 27 9 337 0.06 -0.64 1.20 -0.77
2 12 28 14 423 0.08 -1.05 0.79 -0.73
2 12 29 8 320 0.05 -0.79 1.35 -0.90
2 12 30 7 279 0.08 -0.07 1.21 -0.61
2 13 1 14 315 0.08 1.42 0.09 -0.81
2 13 2 7 408 0.07 1.07 -0.22 0.42
2 13 3 13 377 0.07 1.07 0.05 0.37
2 13 4 12 479 0.08 0.78 -0.67 -0.25
2 13 5 16 375 0.07 1.04 0.05 -0.28
2 13 6 17 364 0.07 0.99 0.15 -0.03
2 13 7 14 409 0.08 1.01 -0.23 -0.28
2 13 8 8 393 0.05 1.01 -0.06 0.15
2 13 9 12 438 0.08 1.12 -0.53 -0.65
2 13 10 19 340 0.08 0.93 0.39 0.08
2 13 11 14 424 0.08 1.29 -0.55 -0.80
2 13 12 15 417 0.06 0.99 -0.26 0.20
2 13 13 18 338 0.1 0.98 0.35 -0.28
2 13 14 11 454 0.08 0.93 -0.56 -0.41
2 13 15 12 436 0.06 0.99 -0.41 0.36
2 13 16 9 462 0.06 0.84 -0.55 -0.01
2 13 17 12 395 0.06 1.19 -0.28 -0.63
2 13 18 21 361 0.1 1.15 0.04 -0.52
2 13 19 9 350 0.07 1.55 -0.21 -0.90
2 13 20 14 419 0.07 0.96 -0.27 -0.01
2 13 21 17 402 0.08 1.09 -0.26 -0.49
2 13 22 9 472 0.06 0.99 -0.74 -0.64
2 13 23 11 446 0.06 0.91 -0.46 -0.18
2 13 24 10 480 0.06 1.06 -0.84 -0.76
2 13 25 15 488 0.07 0.72 -0.69 -0.01
2 13 26 9 478 0.08 0.87 -0.72 -0.49
2 13 27 14 362 0.07 1.00 0.17 0.18
2 13 28 19 374 0.09 1.38 -0.24 -0.80
2 13 29 17 309 0.1 1.20 0.31 -0.64
2 13 30 20 461 0.08 0.87 -0.55 0.22
2 14 1 15 615 0.09 -1.68 -0.12 -0.94
2 14 2 11 560 0.06 -1.62 0.13 -0.93
2 14 3 20 514 0.08 -1.11 0.14 -0.48
2 14 4 22 491 0.09 -1.03 0.29 -0.37
2 14 5 11 508 0.08 -1.60 0.44 -0.94
2 14 6 8 532 0.06 -1.03 -0.01 -0.25
2 14 7 7 474 0.07 -1.16 0.47 -0.66
2 14 8 16 602 0.08 -1.40 -0.27 -0.82
2 14 9 12 559 0.07 -1.79 0.24 -0.98
2 14 10 10 469 0.08 -1.34 0.58 -0.84
2 14 11 14 523 0.08 -1.27 0.21 -0.70
2 14 12 14 672 0.09 -1.47 -0.59 -0.91
2 14 13 16 549 0.08 -1.31 -0.03 -0.73
2 14 14 19 697 0.08 -1.96 -0.36 -1.00
2 14 15 11 528 0.07 -1.46 0.29 -0.86
2 14 16 16 668 0.09 -1.63 -0.44 -0.95
2 14 17 16 568 0.09 -1.18 -0.21 -0.60
2 14 18 16 555 0.07 -1.06 -0.23 -0.40
2 14 19 15 548 0.08 -1.46 0.09 -0.84
2 14 20 10 630 0.08 -1.24 -0.53 -0.76
2 14 21 7 547 0.06 -1.98 0.40 -1.00
2 14 22 12 633 0.08 -2.02 0.06 -1.00
2 14 23 18 603 0.07 -1.09 -0.49 -0.60
2 14 24 15 464 0.08 -1.60 0.74 -0.97
2 14 25 15 500 0.1 -1.85 0.60 -1.00
2 14 26 15 732 0.08 -1.90 -0.67 -0.99
2 14 27 0 NA NA NA NA NA
2 14 28 0 NA NA NA NA NA
2 14 29 0 NA NA NA NA NA
2 14 30 0 NA NA NA NA NA
2 15 1 5 35 0.1 0.34 2.97 -0.13
2 15 2 10 94 0.05 -0.06 2.62 0.78
2 15 3 4 127 0.04 -0.42 2.46 0.87
2 15 4 10 155 0.06 -0.33 2.19 0.97
2 15 5 9 44 0.12 0.44 2.87 0.43
2 15 6 9 102 0.09 0.28 2.34 0.93
2 15 7 10 93 0.08 0.11 2.51 0.85
2 15 8 18 145 0.1 0.46 1.87 1.00
2 15 9 5 95 0.04 -0.26 2.72 0.68
2 15 10 8 158 0.06 0.10 1.97 1.00
2 15 11 11 58 0.09 0.18 2.82 0.56
2 15 12 4 120 0.04 -0.20 2.41 0.91
2 15 13 3 72 0.03 -0.08 2.82 0.57
2 15 14 5 178 0.05 -0.15 1.94 1.00
2 15 15 13 40 0.12 0.87 2.68 0.56
2 15 16 5 107 0.03 -0.39 2.65 0.74
2 15 17 11 77 0.12 1.04 2.16 0.91
2 15 18 8 31 0.09 0.51 2.94 0.12
2 15 19 14 37 0.12 1.26 2.46 0.64
2 15 20 5 90 0.08 -0.50 2.86 0.41
2 15 21 12 47 0.15 0.15 2.97 0.19
2 15 22 11 135 0.07 -0.12 2.23 0.97
2 15 23 19 105 0.1 0.57 2.17 0.96
2 15 24 16 66 0.1 0.41 2.66 0.72
2 15 25 4 73 0.05 -0.22 2.89 0.44
2 15 26 13 23 0.1 1.13 2.69 0.37
2 15 27 12 63 0.1 0.76 2.46 0.81
2 15 28 5 111 0.03 -0.31 2.55 0.82
2 15 29 17 130 0.1 0.87 1.79 0.99
2 15 30 10 24 0.11 0.86 2.85 0.14
2 16 1 5 455 0.06 1.15 -0.67 0.75
2 16 2 7 385 0.07 1.30 -0.19 0.73
2 16 3 10 403 0.06 1.33 -0.36 0.78
2 16 4 8 467 0.05 1.03 -0.69 0.65
2 16 5 10 493 0.12 1.17 -0.94 0.87
2 16 6 15 379 0.06 1.59 -0.31 0.92
2 16 7 12 422 0.05 1.19 -0.42 0.68
2 16 8 18 334 0.1 2.22 -0.48 0.96
2 16 9 7 448 0.05 1.01 -0.54 0.51
2 16 10 5 494 0.04 1.04 -0.88 0.77
2 16 11 17 573 0.11 1.30 -1.51 0.99
2 16 12 9 410 0.08 2.17 -0.89 0.94
2 16 13 8 427 0.1 2.37 -1.13 0.77
2 16 14 10 420 0.09 1.89 -0.81 1.00
2 16 15 6 355 0.06 1.48 -0.10 0.86
2 16 16 13 453 0.06 1.43 -0.80 0.93
2 16 17 8 331 0.07 1.79 -0.16 0.98
2 16 18 10 418 0.07 1.49 -0.54 0.91
2 16 19 2 397 0.02 1.18 -0.23 0.60
2 16 20 16 529 0.09 1.35 -1.25 0.98

Now let’s check the compression summary for HVT (torus_mapC). The table below shows the number of cells, the number of cells with quantization error below the threshold, and the percentage of cells below the threshold at each level.

mapC_compression_summary <- torus_mapC[[3]]$compression_summary %>% dplyr::mutate_if(is.numeric, ~ round(., 4))
compressionSummaryTable(mapC_compression_summary)
segmentLevel noOfCells noOfCellsBelowQuantizationError percentOfCellsBelowQuantizationErrorThreshold parameters
1 30 0 0 n_cells: 30 quant.err: 0.1 distance_metric: L1_Norm error_metric: max quant_method: kmeans
2 895 761 0.85 n_cells: 30 quant.err: 0.1 distance_metric: L1_Norm error_metric: max quant_method: kmeans

As can be seen from the table above, 0% of the cells hit the quantization error threshold at level 1, while 85% of the cells hit the threshold at level 2.
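The percentage column is simply the ratio of cells whose quantization error falls below the threshold to the total cell count at that level. A minimal sketch, using the raw counts from the summary table above:

```r
# Recompute percentOfCellsBelowQuantizationErrorThreshold from the raw counts
# (values copied from the compression summary table above)
cs <- data.frame(segmentLevel = c(1, 2),
                 noOfCells = c(30, 895),
                 noOfCellsBelowQuantizationError = c(0, 761))
round(cs$noOfCellsBelowQuantizationError / cs$noOfCells, 2)  # 0.00 0.85
```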

Let’s plot the Voronoi tessellation for layer 2 (map C)

plotHVT(torus_mapC,
        line.width = c(0.4,0.2), 
        color.vec = c("#141B41","#0582CA"),
        centroid.size = 0.1,
        maxDepth = 2, 
        heatmap = '2Dhvt') 

Figure 14: The Voronoi tessellation for layer 2 (map C) shown for the 924 cells in the dataset 'torus' at level 2

Heat Maps

Now let’s plot all the features for each cell at level two as a heatmap for better visualization.

The heatmaps displayed below provide a visual representation of the spatial characteristics of the torus data, allowing us to observe patterns and trends in the distribution of each of the features (x, y, z). The green shades highlight regions with higher values in each heatmap, while the indigo shades indicate areas with the lowest values. By analyzing these heatmaps, we can gain insight into the variations and relationships among these features within the torus data.

  plotHVT(
  torus_mapC,
  dataset_updated_train,
  child.level = 2,
  hmap.cols = "x",
  line.width = c(0.6,0.4),
  color.vec = c("#141B41","#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100,
  heatmap = '2Dheatmap'
) 

Figure 15: The Voronoi tessellation with the heat map overlaid for feature x in the 'torus' dataset

  plotHVT(
  torus_mapC,
  dataset_updated_train,
  child.level = 2,
  hmap.cols = "y",
  line.width = c(0.6,0.4),
  color.vec = c("#141B41","#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100,
  heatmap = '2Dheatmap'
) 

Figure 16: The Voronoi tessellation with the heat map overlaid for feature y in the 'torus' dataset

  plotHVT(
  torus_mapC,
  dataset_updated_train,
  child.level = 2,
  hmap.cols = "z",
  line.width = c(0.6,0.4),
  color.vec = c("#141B41","#0582CA"),
  palette.color = 6,
  centroid.size = 0.1,
  show.points = TRUE,
  quant.error.hmap = 0.2,
  n_cells.hmap = 100,
  heatmap = '2Dheatmap'
) 

Figure 17: The Voronoi tessellation with the heat map overlaid for feature z in the 'torus' dataset

We now have the set of maps (map A, map B, and map C) that will be used to score which map and cell each test record is assigned to. But before that, let’s view our test dataset.

7. Scoring on Test Data

Now that we have built the model, let us score our test dataset (containing 2400 data points) to determine which cell and which layer each point belongs to.

Testing Dataset

The testing dataset includes the columns x, y, and z.

Let’s have a look at our randomly selected test dataset containing 2400 data points.

Table(head(dataset_updated_test))
x y z
1 -2.6282 0.5656 -0.7253
8 -1.0046 -1.8170 -0.9971
9 -1.3507 0.7445 -0.8891
12 -2.3787 1.7986 -0.1878
15 -1.1740 0.5023 0.6908
16 1.1497 -1.5540 -0.9978

The scoreLayeredHVT function is used to score the test data using the trained set of maps. It takes as input the test data and the set of maps (map A, map B, map C).

Now, let us understand the scoreLayeredHVT function.

scoreLayeredHVT(data,
                map_A,
                map_B,
                map_C,
                mad.threshold = 0.2,
                normalize = TRUE, 
                distance_metric="L1_Norm",
                error_metric="max",
                child.level = 1, 
                line.width = c(0.6, 0.4, 0.2),
                color.vec = c("#141B41", "#6369D1", "#D8D2E1"),
                yVar= NULL,
                ...)

Each of the parameters of the scoreLayeredHVT function is explained below:

The function scores the test data against the HVT maps (map A, map B, and map C) constructed using the trainHVT function. For each test record, the function assigns the record to Layer 1 or Layer 2. Layer 1 contains the cell IDs from map A, and Layer 2 contains the cell IDs from map B (the novelty map) and map C (the map without novelties).

Scoring Algorithm

The Scoring algorithm recursively calculates the distance between each point in the test dataset and the cell centroids for each level. The following steps explain the scoring method for a single point in the test dataset:

  1. Calculate the distance between the point and the centroid of all the cells in the first level.
  2. Find the cell whose centroid has minimum distance to the point.
  3. Check if the cell drills down further to form more cells.
  4. If it doesn’t, return the path. Otherwise, repeat steps 1 to 3 until we reach a level at which the cell doesn’t drill down further.
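The four steps above can be sketched as a small recursive function. This is an illustrative sketch, not the package internals: the `centroids` and `children` fields are hypothetical stand-ins for the HVT map structure, and the L1 norm matches the distance_metric used during training.

```r
# Illustrative sketch of the recursive scoring walk (hypothetical structure,
# not HVT package internals). A cell holds a matrix of child centroids and a
# list of sub-cells (NULL when a child does not drill down further).
score_point <- function(point, cell) {
  # Step 1: L1 distance from the point to every centroid at this level
  d <- apply(cell$centroids, 1, function(ctr) sum(abs(point - ctr)))
  # Step 2: pick the cell whose centroid has the minimum distance
  nearest <- which.min(d)
  # Steps 3-4: return the path at a leaf, otherwise drill down and recurse
  if (is.null(cell$children[[nearest]])) return(nearest)
  c(nearest, score_point(point, cell$children[[nearest]]))
}

# Toy two-level map: cell 2 drills down into two sub-cells
leaf <- list(centroids = rbind(c(4, 4), c(6, 6)), children = list(NULL, NULL))
root <- list(centroids = rbind(c(0, 0), c(5, 5)), children = list(NULL, leaf))
score_point(c(5.8, 6.1), root)  # path: 2 2
```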

Note: The scoring algorithm will not work if any of the variables used to perform quantization are missing, so no features should be removed from the test dataset.

validation_data <- dataset_updated_test
new_score <- scoreLayeredHVT(
    data=validation_data,
    torus_mapA,
    torus_mapB,
    torus_mapC,
    normalize = FALSE
  )

Let’s see which cell and layer each point belongs to and check the Mean Absolute Difference for each of the 2400 records.
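The diff column is the mean absolute difference between a record’s actual and predicted coordinates. For instance, plugging in the actual and predicted values of row 27 of the scored table reproduces its diff value:

```r
# diff = mean absolute difference between actual and predicted coordinates
# (values taken from row 27 of the scored table)
act  <- c(-0.7796, -0.6522, 0.1807)           # act_x, act_y, act_z
pred <- c(-0.7691625, -0.6921120, 0.2002673)  # pred_x, pred_y, pred_z
mean(abs(act - pred))  # 0.0233056
```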

act_pred <- new_score[["actual_predictedTable"]]
rownames(act_pred) <- NULL
act_pred %>% head(1000) %>% as.data.frame() %>% Table(scroll = TRUE)
Row.Number act_x act_y act_z Layer1.Cell.ID Layer2.Cell.ID pred_x pred_y pred_z diff
1 -2.6282 0.5656 -0.7253 A56 C735 -2.6744307 -0.1083882 -0.5495668 0.2986507
2 -1.0046 -1.8170 -0.9971 A402 C734 -1.1324839 -1.3326418 -0.8951142 0.2380759
3 -1.3507 0.7445 -0.8891 A194 C561 -1.4573729 -0.0038881 -0.7758762 0.3227616
4 -2.3787 1.7986 -0.1878 A8 C349 -1.9528751 1.6099130 -0.6652502 0.3639874
5 -1.1740 0.5023 0.6908 A235 C486 -1.1082059 0.3959832 0.5157773 0.1157112
6 1.1497 -1.5540 -0.9978 A748 C749 0.7737695 -2.3217321 -0.7810068 0.4534853
7 -0.8428 -0.5436 0.0755 A367 C600 -0.7691625 -0.6921120 0.2002673 0.1156389
8 -1.1130 -0.6516 -0.7040 A339 C734 -1.1324839 -1.3326418 -0.8951142 0.2972133
9 -2.9090 0.3892 0.3549 A37 C489 -2.3565929 1.0053767 0.6199063 0.4778634
10 -2.5275 1.2898 -0.5463 A33 C349 -1.9528751 1.6099130 -0.6652502 0.3378960
11 -0.0373 -2.8690 -0.4944 A738 C881 -0.7910667 -2.6797486 -0.2267249 0.4035644
12 0.4973 -0.9115 -0.2742 A583 C595 0.2284234 -1.2015087 -0.5962098 0.2936317
13 0.0866 -1.4238 -0.8191 A561 C595 0.2284234 -1.2015087 -0.5962098 0.1956683
14 -0.8804 -0.5036 0.1684 A367 C600 -0.7691625 -0.6921120 0.2002673 0.1105389
15 1.3018 -2.2904 0.7729 A836 C781 0.3633024 -2.3921816 0.7580945 0.3516949
16 -0.8820 0.9907 0.7391 A252 C486 -1.1082059 0.3959832 0.5157773 0.3480818
17 1.1034 -0.7534 -0.7478 A666 C401 1.0490181 -0.2102674 -0.2660895 0.3597417
18 -1.2198 2.1861 -0.8641 A88 C136 -0.3880524 2.3702890 -0.7627342 0.3724342
19 -2.0314 -0.2369 -0.9990 A169 C561 -1.4573729 -0.0038881 -0.7758762 0.3433876
20 -0.8999 0.4451 0.0885 A295 C486 -1.1082059 0.3959832 0.5157773 0.2282333
21 -1.9794 0.0358 -0.9998 A144 C561 -1.4573729 -0.0038881 -0.7758762 0.2618797
22 -0.1499 1.5168 0.8795 A275 C284 0.0540627 1.1974407 0.5548169 0.2826684
23 2.4804 -1.1181 -0.6932 A856 C449 1.9465088 -1.1051116 -0.8303191 0.2279996
24 0.7150 -0.7791 -0.3341 A613 C401 1.0490181 -0.2102674 -0.2660895 0.3236204
25 -0.0655 -1.3311 -0.7448 A545 C595 0.2284234 -1.2015087 -0.5962098 0.1907016
26 -1.9362 2.0723 -0.5486 A21 C349 -1.9528751 1.6099130 -0.6652502 0.1985708
27 -0.7796 -0.6522 0.1807 A403 C600 -0.7691625 -0.6921120 0.2002673 0.0233056
28 -0.7437 0.8610 0.5065 A284 C486 -1.1082059 0.3959832 0.5157773 0.2796000
29 1.5052 0.0445 -0.8694 A680 C401 1.0490181 -0.2102674 -0.2660895 0.4380866
30 1.0742 -0.5237 -0.5934 A651 C401 1.0490181 -0.2102674 -0.2660895 0.2219750
31 0.3589 -1.0419 -0.4400 A555 C595 0.2284234 -1.2015087 -0.5962098 0.1487650
32 -0.9306 0.3664 0.0154 A297 C486 -1.1082059 0.3959832 0.5157773 0.2358555
33 2.3888 -1.0670 0.7875 A863 C440 1.6077818 -0.7377123 0.8863730 0.4030596
34 -0.0128 -1.0262 -0.2276 A532 C595 0.2284234 -1.2015087 -0.5962098 0.2617139
35 -0.8115 0.5856 0.0380 A303 C486 -1.1082059 0.3959832 0.5157773 0.3213667
36 1.1944 -1.3305 -0.9773 A721 C449 1.9465088 -1.1051116 -0.8303191 0.3748260
37 0.7976 1.2753 0.8684 A504 C284 0.0540627 1.1974407 0.5548169 0.3783266
38 -1.8079 -1.4936 0.9386 A243 C796 -1.2416195 -1.6918074 0.8854221 0.2725553
39 -0.1581 1.1475 0.5400 A351 C284 0.0540627 1.1974407 0.5548169 0.0923067
40 1.0428 -1.4598 0.9786 A730 C584 0.3972782 -1.1889885 0.6190008 0.4253108
41 2.5993 0.9612 -0.6364 A765 C143 2.1955674 0.6403446 -0.8206917 0.3029599
42 -2.0228 -0.3879 0.9982 A164 C739 -2.2003530 -0.5265358 0.8437657 0.1568744
43 -2.6629 -0.0732 -0.7478 A89 C735 -2.6744307 -0.1083882 -0.5495668 0.0816507
44 0.5941 0.8393 0.2361 A475 C248 0.7051029 1.0093226 -0.5739830 0.3637028
45 -2.5172 -1.0621 0.6812 A143 C739 -2.2003530 -0.5265358 0.8437657 0.3383256
46 2.7387 0.9703 0.4244 A796 C175 2.7908779 -0.0658210 0.2694047 0.4144314
47 -0.5713 2.7477 -0.5913 A127 C136 -0.3880524 2.3702890 -0.7627342 0.2440309
48 1.1330 2.7487 0.2303 A472 C86 0.4002570 2.4565213 0.7038322 0.4994846
49 1.0197 0.2794 0.3336 A569 C401 1.0490181 -0.2102674 -0.2660895 0.3728917
50 -1.8241 0.5070 0.9943 A140 C486 -1.1082059 0.3959832 0.5157773 0.4351445
51 1.3011 0.0356 -0.7157 A663 C401 1.0490181 -0.2102674 -0.2660895 0.3158533
52 -2.3795 -1.7151 0.3594 A146 C885 -2.2205445 -1.6962525 -0.2547697 0.2639909
53 -1.4309 -0.5108 0.8769 A282 C739 -2.2003530 -0.5265358 0.8437657 0.2727744
54 1.4192 -2.5707 0.3509 A868 C586 2.0481682 -1.8887843 0.3179989 0.4479284
55 -0.7163 -2.3849 -0.8716 A513 C881 -0.7910667 -2.6797486 -0.2267249 0.3381635
56 0.3760 0.9701 0.2816 A443 C284 0.0540627 1.1974407 0.5548169 0.2741650
57 -1.0122 -0.2193 0.2647 A345 C600 -0.7691625 -0.6921120 0.2002673 0.2600940
58 -2.8691 0.0079 -0.4946 A47 C735 -2.6744307 -0.1083882 -0.5495668 0.1219748
59 0.7035 -1.6433 0.9772 A696 C584 0.3972782 -1.1889885 0.6190008 0.3729108
60 2.5001 -1.4493 -0.4563 A880 C449 1.9465088 -1.1051116 -0.8303191 0.4239329
61 -1.4724 1.1331 0.9899 A147 C238 -1.2449921 1.9998908 0.7761860 0.4359709
62 -1.3306 0.7596 -0.8838 A194 C561 -1.4573729 -0.0038881 -0.7758762 0.3327283
63 -1.6866 2.1137 0.7101 A42 C238 -1.2449921 1.9998908 0.7761860 0.2071677
64 -0.9082 0.8214 0.6314 A261 C486 -1.1082059 0.3959832 0.5157773 0.2470151
65 1.7650 0.6723 0.9938 A684 C241 1.3715588 0.6683398 0.7897889 0.2004708
66 -0.5579 0.9877 0.5006 A308 C284 0.0540627 1.1974407 0.5548169 0.2919734
67 -1.5528 -1.0406 -0.9914 A256 C734 -1.1324839 -1.3326418 -0.8951142 0.2695479
68 -0.9365 -0.3825 -0.1517 A359 C600 -0.7691625 -0.6921120 0.2002673 0.2763056
69 -2.3865 -0.7924 -0.8574 A142 C735 -2.6744307 -0.1083882 -0.5495668 0.4265919
70 1.0111 0.3283 -0.3495 A559 C401 1.0490181 -0.2102674 -0.2660895 0.2199654
71 1.2039 -2.4845 0.6489 A822 C781 0.3633024 -2.3921816 0.7580945 0.3473702
72 2.4975 1.6226 0.2071 A780 C43 2.0166149 1.7911539 0.4786294 0.3069895
73 -0.3567 2.5598 0.8114 A160 C86 0.4002570 2.4565213 0.7038322 0.3226012
74 0.2155 2.7833 -0.6109 A239 C136 -0.3880524 2.3702890 -0.7627342 0.3894658
75 -0.7802 -0.9000 -0.5880 A391 C600 -0.7691625 -0.6921120 0.2002673 0.3357310
76 1.5563 0.1715 -0.9008 A680 C143 2.1955674 0.6403446 -0.8206917 0.3960734
77 -0.4761 -2.5549 -0.8008 A656 C881 -0.7910667 -2.6797486 -0.2267249 0.3379635
78 -0.5258 0.8626 -0.1423 A322 C369 -0.6886751 1.0313763 -0.5793286 0.2562267
79 2.6189 0.1871 -0.7801 A820 C143 2.1955674 0.6403446 -0.8206917 0.3057230
80 -1.7678 -0.4595 0.9848 A219 C739 -2.2003530 -0.5265358 0.8437657 0.2135410
81 1.0819 -1.0543 -0.8721 A697 C449 1.9465088 -1.1051116 -0.8303191 0.3190671
82 0.1804 2.5732 0.8150 A209 C86 0.4002570 2.4565213 0.7038322 0.1492345
83 0.5760 -1.6638 0.9709 A696 C584 0.3972782 -1.1889885 0.6190008 0.3351442
84 1.4903 -1.1522 0.9932 A759 C440 1.6077818 -0.7377123 0.8863730 0.2129322
85 -1.8403 -0.3725 0.9925 A164 C739 -2.2003530 -0.5265358 0.8437657 0.2209410
86 1.0006 0.0975 -0.1031 A575 C401 1.0490181 -0.2102674 -0.2660895 0.1730583
87 0.0714 1.3707 0.7787 A397 C284 0.0540627 1.1974407 0.5548169 0.1381599
88 -1.1848 -0.4910 0.6966 A309 C600 -0.7691625 -0.6921120 0.2002673 0.3710274
89 1.0072 -0.3094 -0.3233 A621 C401 1.0490181 -0.2102674 -0.2660895 0.0660538
90 0.5539 -0.8888 0.3037 A593 C584 0.3972782 -1.1889885 0.6190008 0.2573704
91 -0.0365 2.9667 0.2552 A205 C86 0.4002570 2.4565213 0.7038322 0.4651893
92 2.8539 -0.0012 0.5204 A852 C175 2.7908779 -0.0658210 0.2694047 0.1262128
93 2.7738 -1.1262 -0.1121 A883 C175 2.7908779 -0.0658210 0.2694047 0.4863205
94 0.8149 -2.6016 0.6874 A822 C781 0.3633024 -2.3921816 0.7580945 0.2439035
95 -1.5061 -2.0681 0.8296 A287 C796 -1.2416195 -1.6918074 0.8854221 0.2321984
96 -0.8881 2.6110 -0.6524 A110 C136 -0.3880524 2.3702890 -0.7627342 0.2836976
97 -1.9411 -0.5091 -1.0000 A178 C561 -1.4573729 -0.0038881 -0.7758762 0.4043543
98 0.9909 1.3267 -0.9389 A502 C248 0.7051029 1.0093226 -0.5739830 0.3226972
99 0.4983 -0.8672 -0.0185 A580 C584 0.3972782 -1.1889885 0.6190008 0.3534371
100 -0.0894 -1.0165 0.2010 A506 C600 -0.7691625 -0.6921120 0.2002673 0.3349611
101 1.2128 2.4181 -0.7090 A509 C55 1.2437441 2.2381567 -0.6486543 0.0904110
102 1.9506 -1.1446 -0.9652 A789 C449 1.9465088 -1.1051116 -0.8303191 0.0594869
103 -1.0119 1.5636 -0.9905 A191 C369 -0.6886751 1.0313763 -0.5793286 0.4222067
104 -2.7687 0.9605 0.3660 A27 C489 -2.3565929 1.0053767 0.6199063 0.2369634
105 -1.7106 -0.9620 -0.9993 A256 C734 -1.1324839 -1.3326418 -0.8951142 0.3509812
106 -2.1339 -2.0978 -0.1232 A176 C885 -2.2205445 -1.6962525 -0.2547697 0.2065873
107 -1.5127 -0.2305 -0.8828 A218 C561 -1.4573729 -0.0038881 -0.7758762 0.1296210
108 -0.2217 1.7554 0.9730 A248 C284 0.0540627 1.1974407 0.5548169 0.4173017
109 0.9590 0.4768 0.3701 A543 C241 1.3715588 0.6683398 0.7897889 0.3412625
110 -0.3019 -1.8261 0.9888 A534 C796 -1.2416195 -1.6918074 0.8854221 0.3924634
111 -1.1893 0.0866 0.5898 A277 C486 -1.1082059 0.3959832 0.5157773 0.1548333
112 0.7867 -1.3333 0.8921 A672 C584 0.3972782 -1.1889885 0.6190008 0.2689442
113 -0.5395 -0.8478 0.0993 A437 C600 -0.7691625 -0.6921120 0.2002673 0.1621060
114 -0.5542 -1.6053 0.9534 A500 C796 -1.2416195 -1.6918074 0.8854221 0.2806349
115 -1.8705 -1.2751 0.9646 A243 C796 -1.2416195 -1.6918074 0.8854221 0.3749219
116 -1.0969 1.1701 0.9182 A201 C238 -1.2449921 1.9998908 0.7761860 0.3732989
117 2.7170 1.1480 -0.3136 A795 C143 2.1955674 0.6403446 -0.8206917 0.5120599
118 -1.3938 -1.5617 0.9956 A311 C796 -1.2416195 -1.6918074 0.8854221 0.1308219
119 0.4011 2.5350 0.8241 A327 C86 0.4002570 2.4565213 0.7038322 0.0665298
120 1.8651 0.9160 -0.9970 A707 C143 2.1955674 0.6403446 -0.8206917 0.2608104
121 2.9855 -0.2779 -0.0572 A869 C175 2.7908779 -0.0658210 0.2694047 0.2444353
122 2.8166 0.9198 -0.2697 A807 C143 2.1955674 0.6403446 -0.8206917 0.4838266
123 -0.8478 2.8620 -0.1727 A102 C136 -0.3880524 2.3702890 -0.7627342 0.5138309
124 1.6004 -0.2125 0.9227 A724 C440 1.6077818 -0.7377123 0.8863730 0.1896404
125 2.6854 0.0011 -0.7282 A820 C175 2.7908779 -0.0658210 0.2694047 0.3900012
126 -1.1373 2.1088 0.9183 A125 C238 -1.2449921 1.9998908 0.7761860 0.1195717
127 -0.2957 -1.1285 -0.5526 A487 C595 0.2284234 -1.2015087 -0.5962098 0.2135806
128 -2.0382 1.6592 0.7781 A36 C489 -2.3565929 1.0053767 0.6199063 0.3768033
129 0.1572 2.8925 0.4426 A228 C86 0.4002570 2.4565213 0.7038322 0.3134226
130 1.0469 1.4040 0.9686 A522 C241 1.3715588 0.6683398 0.7897889 0.4130434
131 1.2215 -0.4196 0.7058 A671 C440 1.6077818 -0.7377123 0.8863730 0.2949890
132 -1.7603 0.5112 -0.9860 A154 C561 -1.4573729 -0.0038881 -0.7758762 0.3427130
133 -1.6974 -0.1938 -0.9566 A190 C561 -1.4573729 -0.0038881 -0.7758762 0.2035543
134 0.9929 0.1188 -0.0020 A575 C401 1.0490181 -0.2102674 -0.2660895 0.2164250
135 1.7340 2.4082 0.2529 A660 C43 2.0166149 1.7911539 0.4786294 0.3751301
136 -0.7561 -0.6579 0.0675 A398 C600 -0.7691625 -0.6921120 0.2002673 0.0600139
137 2.5066 -1.5255 0.3566 A897 C586 2.0481682 -1.8887843 0.3179989 0.2867724
138 -1.4811 -1.2477 -0.9980 A324 C734 -1.1324839 -1.3326418 -0.8951142 0.1788146
139 1.4575 -2.5965 -0.2104 A870 B17 1.4171714 -2.6314714 -0.1032857 0.0608048
140 -0.9140 0.4308 0.1444 A295 C486 -1.1082059 0.3959832 0.5157773 0.2001333
141 -2.7065 1.2871 -0.0778 A11 C489 -2.3565929 1.0053767 0.6199063 0.4431123
142 -1.0018 1.5801 -0.9916 A191 C369 -0.6886751 1.0313763 -0.5793286 0.4247067
143 -0.7712 1.7412 -0.9954 A170 C369 -0.6886751 1.0313763 -0.5793286 0.4028067
144 -1.8570 -0.4838 0.9967 A179 C739 -2.2003530 -0.5265358 0.8437657 0.1796744
145 2.5489 1.5004 -0.2878 A782 C43 2.0166149 1.7911539 0.4786294 0.5298228
146 1.0839 1.3601 0.9654 A522 C241 1.3715588 0.6683398 0.7897889 0.3850100
147 1.5571 -2.0786 -0.8021 A826 C749 0.7737695 -2.3217321 -0.7810068 0.3491853
148 1.4520 1.5170 0.9950 A582 C241 1.3715588 0.6683398 0.7897889 0.3781042
149 -1.2108 1.1009 -0.9316 A194 C369 -0.6886751 1.0313763 -0.5793286 0.3146400
150 -0.9399 0.4056 0.2165 A295 C486 -1.1082059 0.3959832 0.5157773 0.1590667
151 -0.8965 -0.4593 -0.1206 A359 C600 -0.7691625 -0.6921120 0.2002673 0.2270056
152 0.0344 -1.0504 -0.3152 A526 C595 0.2284234 -1.2015087 -0.5962098 0.2087139
153 -0.3279 1.9996 0.9997 A217 C238 -1.2449921 1.9998908 0.7761860 0.3802989
154 1.0029 0.7058 -0.6336 A529 C248 0.7051029 1.0093226 -0.5739830 0.2203122
155 0.2561 1.0052 0.2705 A422 C284 0.0540627 1.1974407 0.5548169 0.2261983
156 -0.7718 -2.3583 0.8765 A512 C796 -1.2416195 -1.6918074 0.8854221 0.3817447
157 0.2451 -2.7181 -0.6844 A735 C749 0.7737695 -2.3217321 -0.7810068 0.3405481
158 -0.0610 1.7878 0.9775 A274 C284 0.0540627 1.1974407 0.5548169 0.3760350
159 0.5714 0.8391 0.1737 A475 C248 0.7051029 1.0093226 -0.5739830 0.3505361
160 -0.3749 -2.0941 0.9919 A591 C781 0.3633024 -2.3921816 0.7580945 0.4233631
161 0.5248 -2.1090 0.9849 A739 C781 0.3633024 -2.3921816 0.7580945 0.2238282
162 -1.2213 1.9954 0.9406 A125 C238 -1.2449921 1.9998908 0.7761860 0.0641989
163 -0.9010 -0.6120 0.4128 A383 C600 -0.7691625 -0.6921120 0.2002673 0.1414940
164 2.8847 -0.3663 -0.4194 A858 C175 2.7908779 -0.0658210 0.2694047 0.3610353
165 1.6088 -0.8000 -0.9791 A742 C449 1.9465088 -1.1051116 -0.8303191 0.2638671
166 1.5763 -0.8737 0.9802 A761 C440 1.6077818 -0.7377123 0.8863730 0.0870988
167 -2.2140 1.8509 -0.4641 A21 C349 -1.9528751 1.6099130 -0.6652502 0.2344207
168 1.5307 0.0571 0.8836 A706 C241 1.3715588 0.6683398 0.7897889 0.2880640
169 -1.7240 -0.2263 0.9653 A202 C739 -2.2003530 -0.5265358 0.8437657 0.2993744
170 2.8861 0.4693 -0.3824 A837 C175 2.7908779 -0.0658210 0.2694047 0.4273826
171 -0.7511 -2.2100 0.9425 A512 C796 -1.2416195 -1.6918074 0.8854221 0.3552634
172 1.4329 -0.4997 0.8759 A716 C440 1.6077818 -0.7377123 0.8863730 0.1411223
173 -2.3673 -0.9540 -0.8337 A129 C735 -2.6744307 -0.1083882 -0.5495668 0.4789585
174 -1.9019 -1.8715 -0.7439 A224 C885 -2.2205445 -1.6962525 -0.2547697 0.3276741
175 2.3833 0.8251 0.8529 A745 C241 1.3715588 0.6683398 0.7897889 0.4105375
176 2.9056 0.5665 0.2788 A841 C175 2.7908779 -0.0658210 0.2694047 0.2521461
177 0.0245 -1.0441 0.2945 A520 C584 0.3972782 -1.1889885 0.6190008 0.2807225
178 0.7527 1.9224 0.9979 A423 C86 0.4002570 2.4565213 0.7038322 0.3935441
179 -1.0088 0.4147 0.4162 A276 C486 -1.1082059 0.3959832 0.5157773 0.0725667
180 0.5810 -0.8152 -0.0462 A580 C584 0.3972782 -1.1889885 0.6190008 0.4075704
181 1.2082 0.5661 0.7461 A620 C241 1.3715588 0.6683398 0.7897889 0.1030958
182 -1.1111 -2.5855 -0.5806 A442 C881 -0.7910667 -2.6797486 -0.2267249 0.2560523
183 -2.2067 0.6132 -0.9569 A86 C349 -1.9528751 1.6099130 -0.6652502 0.5140626
184 -0.6249 -1.1417 0.7156 A465 C600 -0.7691625 -0.6921120 0.2002673 0.3697277
185 -0.6851 1.2404 -0.8125 A267 C369 -0.6886751 1.0313763 -0.5793286 0.1485901
186 2.2753 0.1059 -0.9606 A785 C143 2.1955674 0.6403446 -0.8206917 0.2513618
187 2.3168 1.7429 -0.4376 A740 C43 2.0166149 1.7911539 0.4786294 0.4215561
188 -1.4839 2.2916 0.6833 A62 C238 -1.2449921 1.9998908 0.7761860 0.2078344
189 -0.8593 -1.1445 0.8225 A414 C796 -1.2416195 -1.6918074 0.8854221 0.3308496
190 -1.3309 2.3282 0.7316 A62 C238 -1.2449921 1.9998908 0.7761860 0.1529344
191 0.5556 -1.1805 -0.7187 A619 C595 0.2284234 -1.2015087 -0.5962098 0.1568918
192 1.3851 0.7804 0.9120 A612 C241 1.3715588 0.6683398 0.7897889 0.0826042
193 1.0132 0.3862 -0.4019 A550 C401 1.0490181 -0.2102674 -0.2660895 0.2560320
194 2.8031 0.3416 0.5668 A835 C175 2.7908779 -0.0658210 0.2694047 0.2390128
195 -2.5328 0.0566 -0.8458 A89 C735 -2.6744307 -0.1083882 -0.5495668 0.2009507
196 2.0388 -0.2970 0.9982 A779 C440 1.6077818 -0.7377123 0.8863730 0.3278525
197 2.8053 -0.8471 -0.3664 A883 C175 2.7908779 -0.0658210 0.2694047 0.4771686
198 1.2206 2.4705 -0.6551 A509 C55 1.2437441 2.2381567 -0.6486543 0.0873110
199 -1.5086 -1.8891 0.9087 A364 C796 -1.2416195 -1.6918074 0.8854221 0.1625170
200 0.5153 -0.9291 -0.3478 A583 C595 0.2284234 -1.2015087 -0.5962098 0.2692317
201 -2.3374 1.2861 -0.7443 A55 C349 -1.9528751 1.6099130 -0.6652502 0.2624626
202 -0.7875 0.6754 -0.2710 A290 C369 -0.6886751 1.0313763 -0.5793286 0.2543766
203 -2.2720 -0.4398 -0.9493 A145 C735 -2.6744307 -0.1083882 -0.5495668 0.3778585
204 1.5450 -0.4418 -0.9195 A714 C449 1.9465088 -1.1051116 -0.8303191 0.3846671
205 1.6703 -1.3080 -0.9926 A784 C449 1.9465088 -1.1051116 -0.8303191 0.2137927
206 -1.3192 -1.2514 0.9834 A320 C796 -1.2416195 -1.6918074 0.8854221 0.2053219
207 -0.4311 1.6988 -0.9689 A238 C136 -0.3880524 2.3702890 -0.7627342 0.3069008
208 2.3736 -0.1882 0.9246 A818 C175 2.7908779 -0.0658210 0.2694047 0.3982841
209 -0.4002 -1.4649 -0.8765 A492 C734 -1.1324839 -1.3326418 -0.8951142 0.2943854
210 -2.3193 1.4622 -0.6706 A22 C349 -1.9528751 1.6099130 -0.6652502 0.1731626
211 -0.0976 -0.9967 -0.0533 A501 C595 0.2284234 -1.2015087 -0.5962098 0.3579139
212 -0.5546 -0.8325 -0.0257 A420 C600 -0.7691625 -0.6921120 0.2002673 0.1936393
213 -1.3681 -1.6062 -0.9939 A370 C734 -1.1324839 -1.3326418 -0.8951142 0.2026534
214 -0.8104 -1.7267 -0.9957 A444 C734 -1.1324839 -1.3326418 -0.8951142 0.2722426
215 0.8662 0.5902 -0.3067 A527 C248 0.7051029 1.0093226 -0.5739830 0.2825009
216 -2.7502 -0.3362 -0.6372 A85 C735 -2.6744307 -0.1083882 -0.5495668 0.1304048
217 1.0603 0.3509 -0.4691 A550 C401 1.0490181 -0.2102674 -0.2660895 0.2584866
218 -2.6107 -1.4740 -0.0620 A126 C885 -2.2205445 -1.6962525 -0.2547697 0.2683926
219 2.8072 0.6291 0.4808 A808 C175 2.7908779 -0.0658210 0.2694047 0.3075461
220 0.6596 -1.2931 -0.8362 A650 C595 0.2284234 -1.2015087 -0.5962098 0.2542527
221 2.8809 0.1509 -0.4659 A843 C175 2.7908779 -0.0658210 0.2694047 0.3473493
222 0.3406 2.2672 -0.9562 A330 C136 -0.3880524 2.3702890 -0.7627342 0.3417357
223 -0.1712 1.1737 -0.5811 A343 C369 -0.6886751 1.0313763 -0.5793286 0.2205234
224 0.1953 -1.1820 -0.5973 A549 C595 0.2284234 -1.2015087 -0.5962098 0.0179074
225 0.9986 -0.0710 -0.0482 A575 C401 1.0490181 -0.2102674 -0.2660895 0.1358583
226 0.7792 -1.0729 0.7387 A652 C584 0.3972782 -1.1889885 0.6190008 0.2059032
227 -0.7102 -1.2746 0.8411 A463 C796 -1.2416195 -1.6918074 0.8854221 0.3309830
228 -2.5955 -1.4823 0.1483 A118 C885 -2.2205445 -1.6962525 -0.2547697 0.3306592
229 1.7647 -0.7607 -0.9969 A758 C449 1.9465088 -1.1051116 -0.8303191 0.2309338
230 1.6531 -1.5540 0.9632 A832 C440 1.6077818 -0.7377123 0.8863730 0.3128110
231 -1.2450 -0.3171 -0.6989 A281 C561 -1.4573729 -0.0038881 -0.7758762 0.2008536
232 -0.9681 2.6924 -0.5083 A74 C136 -0.3880524 2.3702890 -0.7627342 0.3855309
233 -1.0457 -0.0036 0.2987 A316 C486 -1.1082059 0.3959832 0.5157773 0.2263888
234 -0.8489 0.5665 0.2018 A288 C486 -1.1082059 0.3959832 0.5157773 0.2479333
235 -0.4699 1.3716 -0.8351 A300 C369 -0.6886751 1.0313763 -0.5793286 0.2715901
236 -0.7789 2.6034 0.6967 A94 C238 -1.2449921 1.9998908 0.7761860 0.3830291
237 0.2395 -2.7690 0.6266 A755 C781 0.3633024 -2.3921816 0.7580945 0.2107051
238 -2.9494 -0.3834 -0.2255 A59 C735 -2.6744307 -0.1083882 -0.5495668 0.2913493
239 -1.0345 0.1995 0.3228 A292 C486 -1.1082059 0.3959832 0.5157773 0.1543888
240 -2.0865 0.0771 0.9961 A141 C739 -2.2003530 -0.5265358 0.8437657 0.2899410
241 2.7766 1.1255 -0.0887 A795 C175 2.7908779 -0.0658210 0.2694047 0.5212345
242 -0.7099 0.9201 -0.5458 A283 C369 -0.6886751 1.0313763 -0.5793286 0.0553432
243 -1.6167 -0.5393 -0.9553 A223 C561 -1.4573729 -0.0038881 -0.7758762 0.2913876
244 1.9360 -1.3374 0.9356 A819 C440 1.6077818 -0.7377123 0.8863730 0.3257110
245 1.3556 -0.2734 -0.7869 A687 C401 1.0490181 -0.2102674 -0.2660895 0.2968417
246 -2.2203 0.7815 0.9353 A92 C489 -2.3565929 1.0053767 0.6199063 0.2251877
247 0.8459 -0.6642 0.3811 A624 C584 0.3972782 -1.1889885 0.6190008 0.4037704
248 -1.0539 2.5758 -0.6220 A110 C136 -0.3880524 2.3702890 -0.7627342 0.3373643
249 -0.9229 1.7818 1.0000 A159 C238 -1.2449921 1.9998908 0.7761860 0.2546656
250 -0.1989 1.0983 -0.4677 A363 C369 -0.6886751 1.0313763 -0.5793286 0.2227758
251 0.8937 1.2254 0.8755 A504 C241 1.3715588 0.6683398 0.7897889 0.3735434
252 -1.8283 1.7365 -0.8532 A72 C349 -1.9528751 1.6099130 -0.6652502 0.1463706
253 2.9154 0.5696 -0.2409 A829 C175 2.7908779 -0.0658210 0.2694047 0.4234159
254 0.8448 0.5898 -0.2445 A527 C248 0.7051029 1.0093226 -0.5739830 0.2962342
255 -2.0144 -1.4850 -0.8645 A184 C885 -2.2205445 -1.6962525 -0.2547697 0.3423758
256 0.9063 0.4267 0.0592 A530 C401 1.0490181 -0.2102674 -0.2660895 0.3683250
257 -0.9552 -1.0013 -0.7876 A405 C734 -1.1324839 -1.3326418 -0.8951142 0.2053800
258 -2.0539 -1.5152 0.8336 A189 C796 -1.2416195 -1.6918074 0.8854221 0.3469033
259 -1.7926 -1.8093 0.8372 A244 C796 -1.2416195 -1.6918074 0.8854221 0.2388984
260 0.1331 1.1610 -0.5557 A415 C248 0.7051029 1.0093226 -0.5739830 0.2473211
261 2.4069 0.3553 0.9014 A803 C175 2.7908779 -0.0658210 0.2694047 0.4790314
262 -2.1630 1.9886 -0.3462 A6 C349 -1.9528751 1.6099130 -0.6652502 0.3026207
263 2.5722 0.6671 0.7536 A783 C241 1.3715588 0.6683398 0.7897889 0.4126900
264 0.3623 -1.1205 -0.5689 A587 C595 0.2284234 -1.2015087 -0.5962098 0.0807317
265 0.4342 -1.0698 0.5340 A597 C584 0.3972782 -1.1889885 0.6190008 0.0803704
266 2.1069 0.2017 -0.9932 A741 C143 2.1955674 0.6403446 -0.8206917 0.2332734
267 -2.4081 -0.2104 0.9088 A99 C739 -2.2003530 -0.5265358 0.8437657 0.1963057
268 -0.6577 2.1081 0.9781 A195 C238 -1.2449921 1.9998908 0.7761860 0.2991384
269 0.3343 -1.3031 -0.7559 A594 C595 0.2284234 -1.2015087 -0.5962098 0.1223861
270 -2.2791 -0.2017 -0.9576 A115 C735 -2.6744307 -0.1083882 -0.5495668 0.2988919
271 -0.4030 2.2477 0.9590 A198 C86 0.4002570 2.4565213 0.7038322 0.4224154
272 0.2779 -1.2284 -0.6720 A594 C595 0.2284234 -1.2015087 -0.5962098 0.0507194
273 2.6515 0.1785 -0.7535 A820 C143 2.1955674 0.6403446 -0.8206917 0.3283230
274 -0.0687 -1.6619 0.9416 A579 C796 -1.2416195 -1.6918074 0.8854221 0.4196683
275 0.4541 -0.9390 0.2902 A564 C584 0.3972782 -1.1889885 0.6190008 0.2118704
276 0.7919 2.8503 -0.2860 A408 C55 1.2437441 2.2381567 -0.6486543 0.4755472
277 2.6308 -0.8052 -0.6600 A847 C449 1.9465088 -1.1051116 -0.8303191 0.3848406
278 -2.2975 0.3545 0.9458 A87 C489 -2.3565929 1.0053767 0.6199063 0.3452877
279 0.5173 -1.0812 0.5981 A597 C584 0.3972782 -1.1889885 0.6190008 0.0829037
280 0.1077 0.9951 -0.0432 A404 C284 0.0540627 1.1974407 0.5548169 0.2846650
281 1.0068 -0.7827 -0.6890 A659 C401 1.0490181 -0.2102674 -0.2660895 0.3458538
282 -2.3813 -0.4896 0.9023 A134 C739 -2.2003530 -0.5265358 0.8437657 0.0921391
283 -1.5178 2.1101 -0.8006 A60 C349 -1.9528751 1.6099130 -0.6652502 0.3568706
284 1.1208 -0.7868 -0.7761 A678 C401 1.0490181 -0.2102674 -0.2660895 0.3861083
285 0.9976 2.3613 -0.8262 A440 C55 1.2437441 2.2381567 -0.6486543 0.1822777
286 1.1307 0.1581 0.5131 A617 C241 1.3715588 0.6683398 0.7897889 0.3425958
287 -1.0325 0.0568 -0.2587 A318 C561 -1.4573729 -0.0038881 -0.7758762 0.3342457
288 -1.6338 -0.8780 0.9894 A258 C739 -2.2003530 -0.5265358 0.8437657 0.3545505
289 1.9919 -1.5481 0.8525 A850 C586 2.0481682 -1.8887843 0.3179989 0.3104845
290 -1.0691 0.8042 -0.7494 A234 C369 -0.6886751 1.0313763 -0.5793286 0.2592242
291 0.9915 -1.2494 0.9143 A699 C584 0.3972782 -1.1889885 0.6190008 0.3166442
292 -0.4769 -1.6798 -0.9672 A524 C734 -1.1324839 -1.3326418 -0.8951142 0.3582759
293 -2.7218 -0.0851 0.6907 A58 C739 -2.2003530 -0.5265358 0.8437657 0.3719828
294 -1.7058 -2.2997 0.5048 A353 C796 -1.2416195 -1.6918074 0.8854221 0.4842317
295 -1.6822 0.2015 0.9521 A186 C486 -1.1082059 0.3959832 0.5157773 0.4016000
296 0.2877 -1.2991 -0.7428 A594 C595 0.2284234 -1.2015087 -0.5962098 0.1011527
297 2.9631 0.4671 -0.0246 A829 C175 2.7908779 -0.0658210 0.2694047 0.3330493
298 -2.4265 0.7479 0.8422 A78 C489 -2.3565929 1.0053767 0.6199063 0.1832258
299 -2.8681 -0.5478 -0.3921 A84 C735 -2.6744307 -0.1083882 -0.5495668 0.2635160
300 1.8017 0.0424 0.9802 A744 C440 1.6077818 -0.7377123 0.8863730 0.3559525
301 -1.4152 -2.5763 -0.3429 A409 C881 -0.7910667 -2.6797486 -0.2267249 0.2812523
302 -1.3721 -2.6648 0.0734 A439 C881 -0.7910667 -2.6797486 -0.2267249 0.2987023
303 -0.9238 0.3833 0.0208 A297 C486 -1.1082059 0.3959832 0.5157773 0.2306888
304 0.7374 0.7256 0.2605 A496 C248 0.7051029 1.0093226 -0.5739830 0.3835009
305 1.6448 1.3347 0.9930 A645 C241 1.3715588 0.6683398 0.7897889 0.3809375
306 -1.3911 1.9937 0.9023 A107 C238 -1.2449921 1.9998908 0.7761860 0.0928042
307 -2.5686 0.0128 -0.8226 A89 C735 -2.6744307 -0.1083882 -0.5495668 0.1666840
308 -1.9598 0.0976 -0.9993 A144 C561 -1.4573729 -0.0038881 -0.7758762 0.2757797
309 0.5344 -0.9608 0.4346 A597 C584 0.3972782 -1.1889885 0.6190008 0.1832371
310 -1.3200 2.5172 -0.5390 A73 C136 -0.3880524 2.3702890 -0.7627342 0.4341976
311 -1.2068 -0.6809 0.7890 A282 C600 -0.7691625 -0.6921120 0.2002673 0.3458607
312 0.3877 -0.9238 0.0604 A567 C584 0.3972782 -1.1889885 0.6190008 0.2777892
313 0.5162 2.8466 -0.4501 A356 C55 1.2437441 2.2381567 -0.6486543 0.5115139
314 -1.1840 0.7419 -0.7979 A234 C369 -0.6886751 1.0313763 -0.5793286 0.3344575
315 1.6467 -0.0003 -0.9355 A729 C143 2.1955674 0.6403446 -0.8206917 0.4347734
316 0.5540 -2.3523 0.9091 A736 C781 0.3633024 -2.3921816 0.7580945 0.1271949
317 -1.4389 -0.3313 -0.8521 A257 C561 -1.4573729 -0.0038881 -0.7758762 0.1407029
318 -2.6780 1.0736 0.4653 A27 C489 -2.3565929 1.0053767 0.6199063 0.1814123
319 2.2345 0.6416 0.9458 A752 C241 1.3715588 0.6683398 0.7897889 0.3485640
320 -1.7897 -0.5774 0.9928 A179 C739 -2.2003530 -0.5265358 0.8437657 0.2035172
321 0.6372 -2.6352 -0.7030 A756 C749 0.7737695 -2.3217321 -0.7810068 0.1760147
322 -2.1763 -1.2862 0.8493 A167 C739 -2.2003530 -0.5265358 0.8437657 0.2630838
323 1.0991 0.3576 0.5361 A586 C241 1.3715588 0.6683398 0.7897889 0.2789625
324 -0.3813 -1.1766 0.6462 A491 C584 0.3972782 -1.1889885 0.6190008 0.2727219
325 0.9702 2.6226 0.6049 A482 C86 0.4002570 2.4565213 0.7038322 0.2783179
326 2.1926 -1.0559 -0.9011 A825 C449 1.9465088 -1.1051116 -0.8303191 0.1220279
327 -2.2882 -0.6052 -0.9303 A142 C735 -2.6744307 -0.1083882 -0.5495668 0.4212585
328 -1.7867 -0.0814 0.9774 A163 C739 -2.2003530 -0.5265358 0.8437657 0.3308077
329 2.1032 -0.7794 -0.9700 A791 C449 1.9465088 -1.1051116 -0.8303191 0.2073613
330 -0.8027 -1.3300 0.8948 A463 C796 -1.2416195 -1.6918074 0.8854221 0.2700349
331 -2.2123 -0.1626 0.9759 A121 C739 -2.2003530 -0.5265358 0.8437657 0.1693391
332 -2.4229 -1.2225 0.7003 A143 C739 -2.2003530 -0.5265358 0.8437657 0.3539923
333 -1.5901 0.5485 0.9481 A181 C486 -1.1082059 0.3959832 0.5157773 0.3555779
334 2.6904 0.1374 0.7201 A830 C175 2.7908779 -0.0658210 0.2694047 0.2514647
335 -2.2261 0.2665 -0.9703 A114 C561 -1.4573729 -0.0038881 -0.7758762 0.4111797
336 -0.3679 -0.9352 0.0993 A460 C600 -0.7691625 -0.6921120 0.2002673 0.2484393
337 -2.8890 -0.6340 0.2876 A69 C739 -2.2003530 -0.5265358 0.8437657 0.4507590
338 -0.1818 -1.1993 -0.6169 A523 C595 0.2284234 -1.2015087 -0.5962098 0.1443741
339 2.9111 -0.2923 0.3782 A865 C175 2.7908779 -0.0658210 0.2694047 0.1518321
340 -1.0786 0.0301 0.3897 A293 C486 -1.1082059 0.3959832 0.5157773 0.1738555
341 2.7470 0.2799 0.6485 A830 C175 2.7908779 -0.0658210 0.2694047 0.2562314
342 1.3447 -0.5565 -0.8386 A693 C449 1.9465088 -1.1051116 -0.8303191 0.3862338
343 -0.8254 -0.7435 0.4576 A386 C600 -0.7691625 -0.6921120 0.2002673 0.1216527
344 0.9310 2.2616 -0.8952 A440 C55 1.2437441 2.2381567 -0.6486543 0.1942444
345 -0.6490 -1.7998 -0.9962 A481 C734 -1.1324839 -1.3326418 -0.8951142 0.3505759
346 1.4657 0.3817 -0.8743 A623 C143 2.1955674 0.6403446 -0.8206917 0.3473734
347 0.7236 1.1596 -0.7740 A479 C248 0.7051029 1.0093226 -0.5739830 0.1229305
348 -1.1351 0.1455 -0.5176 A273 C561 -1.4573729 -0.0038881 -0.7758762 0.2433124
349 -2.4647 1.2051 0.6687 A40 C489 -2.3565929 1.0053767 0.6199063 0.1188747
350 0.1615 -1.2518 -0.6750 A547 C595 0.2284234 -1.2015087 -0.5962098 0.0653350
351 -1.0539 -0.0074 0.3239 A316 C486 -1.1082059 0.3959832 0.5157773 0.2165221
352 -0.9611 0.5012 -0.4010 A270 C369 -0.6886751 1.0313763 -0.5793286 0.3269766
353 -1.4664 -1.0132 0.9760 A258 C796 -1.2416195 -1.6918074 0.8854221 0.3313219
354 1.9382 2.1926 -0.3765 A712 C55 1.2437441 2.2381567 -0.6486543 0.3373890
355 -1.6451 0.2388 0.9413 A186 C486 -1.1082059 0.3959832 0.5157773 0.3732000
356 1.0860 -0.1312 -0.4231 A608 C401 1.0490181 -0.2102674 -0.2660895 0.0910199
357 1.8996 -1.1688 -0.9731 A784 C449 1.9465088 -1.1051116 -0.8303191 0.0844594
358 -0.8169 2.3585 0.8683 A128 C238 -1.2449921 1.9998908 0.7761860 0.2929384
359 -1.9833 -1.8623 0.6934 A204 C796 -1.2416195 -1.6918074 0.8854221 0.3680651
360 -0.5872 -1.8831 -0.9996 A481 C734 -1.1324839 -1.3326418 -0.8951142 0.4000759
361 -0.5589 -1.1118 0.6550 A465 C584 0.3972782 -1.1889885 0.6190008 0.3564553
362 -0.9248 1.0850 -0.8186 A226 C369 -0.6886751 1.0313763 -0.5793286 0.1763400
363 0.0034 2.6566 -0.7542 A210 C136 -0.3880524 2.3702890 -0.7627342 0.2287658
364 -0.1814 -2.9475 0.3027 A734 C881 -0.7910667 -2.6797486 -0.2267249 0.4689477
365 1.5398 -1.1328 0.9961 A761 C440 1.6077818 -0.7377123 0.8863730 0.1909322
366 1.1767 -0.2941 -0.6168 A646 C401 1.0490181 -0.2102674 -0.2660895 0.1874083
367 -0.2093 -1.9874 -1.0000 A633 C749 0.7737695 -2.3217321 -0.7810068 0.5121316
368 1.1630 -2.7254 0.2688 A868 C781 0.3633024 -2.3921816 0.7580945 0.5407369
369 -1.6836 1.7733 0.8954 A80 C238 -1.2449921 1.9998908 0.7761860 0.2614709
370 -0.1807 -2.8425 -0.5296 A701 C881 -0.7910667 -2.6797486 -0.2267249 0.3586644
371 0.0187 -2.2072 -0.9783 A633 C749 0.7737695 -2.3217321 -0.7810068 0.3556316
372 -1.3799 -2.3803 -0.6599 A382 C881 -0.7910667 -2.6797486 -0.2267249 0.4404857
373 -0.9287 1.3986 -0.9470 A191 C369 -0.6886751 1.0313763 -0.5793286 0.3249733
374 0.4249 -1.9171 -0.9993 A676 C749 0.7737695 -2.3217321 -0.7810068 0.3239316
375 0.6745 -1.7354 0.9904 A720 C584 0.3972782 -1.1889885 0.6190008 0.3983442
376 0.2213 -1.4961 0.8730 A611 C584 0.3972782 -1.1889885 0.6190008 0.2456963
377 -1.5839 0.7058 0.9640 A181 C486 -1.1082059 0.3959832 0.5157773 0.4112445
378 -0.1827 -0.9837 0.0313 A490 C600 -0.7691625 -0.6921120 0.2002673 0.3490060
379 -2.5303 -0.3004 0.8365 A99 C739 -2.2003530 -0.5265358 0.8437657 0.1877828
380 2.5398 1.2248 -0.5728 A768 C143 2.1955674 0.6403446 -0.8206917 0.3921932
381 1.2291 0.2791 -0.6730 A638 C401 1.0490181 -0.2102674 -0.2660895 0.3587866
382 -1.0316 0.0238 0.2505 A316 C486 -1.1082059 0.3959832 0.5157773 0.2380221
383 -2.7076 -1.1650 0.3195 A104 C885 -2.2205445 -1.6962525 -0.2547697 0.5308592
384 -1.1706 0.1637 -0.5752 A272 C561 -1.4573729 -0.0038881 -0.7758762 0.2183457
385 2.8309 0.9756 -0.1071 A806 B9 2.8198727 1.0084818 0.0784909 0.0765000
386 -1.6992 -1.2832 0.9916 A243 C796 -1.2416195 -1.6918074 0.8854221 0.3241219
387 -1.5593 2.0745 0.8036 A61 C238 -1.2449921 1.9998908 0.7761860 0.1387770
388 -0.3637 0.9890 -0.3233 A363 C369 -0.6886751 1.0313763 -0.5793286 0.2077933
389 -1.0789 -0.1419 0.4106 A319 C486 -1.1082059 0.3959832 0.5157773 0.2241221
390 -0.3613 0.9326 -0.0155 A344 C369 -0.6886751 1.0313763 -0.5793286 0.3299933
391 2.2785 0.2863 -0.9551 A769 C143 2.1955674 0.6403446 -0.8206917 0.1904618
392 -0.6039 -0.9425 -0.4738 A446 C600 -0.7691625 -0.6921120 0.2002673 0.3632393
393 2.6253 -1.1065 0.5285 A885 C175 2.7908779 -0.0658210 0.2694047 0.4884507
394 -1.5455 0.4921 -0.9258 A185 C561 -1.4573729 -0.0038881 -0.7758762 0.2446797
395 1.5625 -1.1111 0.9966 A761 C440 1.6077818 -0.7377123 0.8863730 0.1762988
396 -2.3129 0.5588 0.9252 A78 C489 -2.3565929 1.0053767 0.6199063 0.2651877
397 1.6575 1.1324 -1.0000 A605 C143 2.1955674 0.6403446 -0.8206917 0.4031437
398 0.9318 0.3663 0.0501 A530 C401 1.0490181 -0.2102674 -0.2660895 0.3366583
399 0.6878 0.9410 -0.5512 A483 C248 0.7051029 1.0093226 -0.5739830 0.0361361
400 -0.6370 -0.8179 -0.2685 A425 C600 -0.7691625 -0.6921120 0.2002673 0.2422393
401 1.6784 0.2716 -0.9540 A689 C143 2.1955674 0.6403446 -0.8206917 0.3397401
402 -2.4782 -0.2700 0.8701 A99 C739 -2.2003530 -0.5265358 0.8437657 0.1869057
403 -1.9190 1.7990 -0.7763 A38 C349 -1.9528751 1.6099130 -0.6652502 0.1113373
404 0.2327 0.9821 -0.1362 A428 C248 0.7051029 1.0093226 -0.5739830 0.3124695
405 0.6633 -0.7504 0.0557 A595 C401 1.0490181 -0.2102674 -0.2660895 0.4158801
406 0.5814 -0.8147 0.0422 A580 C584 0.3972782 -1.1889885 0.6190008 0.3784037
407 0.0502 1.2190 0.6258 A374 C284 0.0540627 1.1974407 0.5548169 0.0321350
408 2.1821 -0.0983 0.9829 A773 C440 1.6077818 -0.7377123 0.8863730 0.4367525
409 0.3589 2.5210 -0.8375 A350 C136 -0.3880524 2.3702890 -0.7627342 0.3241431
410 -1.3236 -2.2095 -0.8177 A361 C734 -1.1324839 -1.3326418 -0.8951142 0.3817962
411 -0.9373 1.1466 -0.8547 A226 C369 -0.6886751 1.0313763 -0.5793286 0.2130733
412 0.4871 -0.8763 0.0716 A567 C584 0.3972782 -1.1889885 0.6190008 0.3166371
413 -2.5717 -0.8854 -0.6941 A129 C735 -2.6744307 -0.1083882 -0.5495668 0.3414252
414 1.6572 -2.5004 -0.0235 A888 C586 2.0481682 -1.8887843 0.3179989 0.4480276
415 -0.2281 -1.2036 -0.6320 A523 C595 0.2284234 -1.2015087 -0.5962098 0.1648016
416 -1.7234 0.1759 -0.9635 A177 C561 -1.4573729 -0.0038881 -0.7758762 0.2111464
417 1.1076 -2.5128 0.6658 A822 C781 0.3633024 -2.3921816 0.7580945 0.3190702
418 1.4901 0.7541 -0.9440 A647 C143 2.1955674 0.6403446 -0.8206917 0.3141770
419 -2.0235 -1.2956 -0.9153 A192 C734 -1.1324839 -1.3326418 -0.8951142 0.3160812
420 2.0184 2.0456 -0.4864 A712 C55 1.2437441 2.2381567 -0.6486543 0.3764890
421 -0.4106 1.2874 -0.7610 A300 C369 -0.6886751 1.0313763 -0.5793286 0.2385901
422 0.7886 -1.7722 0.9982 A747 C781 0.3633024 -2.3921816 0.7580945 0.4284615
423 0.9158 2.6601 0.5818 A482 C86 0.4002570 2.4565213 0.7038322 0.2803846
424 2.4226 -1.7693 -0.0133 A899 C586 2.0481682 -1.8887843 0.3179989 0.2750716
425 2.7399 1.1841 0.1734 A792 C175 2.7908779 -0.0658210 0.2694047 0.4656345
426 1.4578 -1.9239 0.9104 A833 C586 2.0481682 -1.8887843 0.3179989 0.4059617
427 0.9822 -0.5300 -0.4676 A631 C401 1.0490181 -0.2102674 -0.2660895 0.1960204
428 -0.3630 -2.3799 0.9132 A643 C781 0.3633024 -2.3921816 0.7580945 0.2978965
429 2.4091 0.8707 -0.8274 A765 C143 2.1955674 0.6403446 -0.8206917 0.1501988
430 -1.3041 1.0234 0.9396 A196 C238 -1.2449921 1.9998908 0.7761860 0.3996709
431 0.1986 1.3257 0.7517 A397 C284 0.0540627 1.1974407 0.5548169 0.1565599
432 1.4941 -2.3040 0.6659 A860 C781 0.3633024 -2.3921816 0.7580945 0.4370579
433 2.1350 2.0805 -0.1938 A737 C43 2.0166149 1.7911539 0.4786294 0.3600535
434 0.6970 1.4079 -0.9033 A478 C248 0.7051029 1.0093226 -0.5739830 0.2453324
435 -2.0764 1.3292 0.8851 A63 C489 -2.3565929 1.0053767 0.6199063 0.2897366
436 2.8755 -0.5323 0.3815 A876 C175 2.7908779 -0.0658210 0.2694047 0.2210655
437 -0.6149 0.7934 -0.0864 A322 C369 -0.6886751 1.0313763 -0.5793286 0.2682267
438 -0.6429 2.3290 0.9093 A128 C238 -1.2449921 1.9998908 0.7761860 0.3547717
439 -1.3171 -2.6870 -0.1225 A372 C881 -0.7910667 -2.6797486 -0.2267249 0.2125032
440 0.2996 0.9607 0.1119 A422 C284 0.0540627 1.1974407 0.5548169 0.3083983
441 2.6014 0.6376 -0.7346 A800 C143 2.1955674 0.6403446 -0.8206917 0.1648896
442 2.3438 -1.7972 0.3014 A899 C586 2.0481682 -1.8887843 0.3179989 0.1346050
443 0.8123 -2.1626 0.9507 A776 C781 0.3633024 -2.3921816 0.7580945 0.2903949
444 -0.7783 -2.3541 0.8776 A512 C796 -1.2416195 -1.6918074 0.8854221 0.3778114
445 2.3890 1.3297 -0.6790 A768 C143 2.1955674 0.6403446 -0.8206917 0.3414932
446 -0.8188 -2.8672 -0.1901 A515 C881 -0.7910667 -2.6797486 -0.2267249 0.0839365
447 -1.0606 1.6690 -0.9997 A191 C349 -1.9528751 1.6099130 -0.6652502 0.4286040
448 -2.8645 -0.5661 -0.3921 A84 C735 -2.6744307 -0.1083882 -0.5495668 0.2684160
449 0.1097 -2.1341 0.9906 A657 C781 0.3633024 -2.3921816 0.7580945 0.2480631
450 1.1267 -0.3888 -0.5891 A646 C401 1.0490181 -0.2102674 -0.2660895 0.1930750
451 0.7308 1.2867 -0.8540 A450 C248 0.7051029 1.0093226 -0.5739830 0.1943639
452 1.5337 0.3050 0.8998 A669 C241 1.3715588 0.6683398 0.7897889 0.2118307
453 0.1425 1.0985 -0.4514 A410 C248 0.7051029 1.0093226 -0.5739830 0.2581211
454 0.8840 -2.7409 0.4751 A822 C781 0.3633024 -2.3921816 0.7580945 0.3841369
455 1.2002 -1.1481 -0.9408 A732 C449 1.9465088 -1.1051116 -0.8303191 0.2999260
456 -2.4310 -1.7577 -0.0142 A126 C885 -2.2205445 -1.6962525 -0.2547697 0.1708242
457 -0.5807 1.1237 -0.6779 A267 C369 -0.6886751 1.0313763 -0.5793286 0.0996234
458 1.4558 2.4311 -0.5523 A556 C55 1.2437441 2.2381567 -0.6486543 0.1671178
459 -1.5620 -0.0182 -0.8990 A218 C561 -1.4573729 -0.0038881 -0.7758762 0.0806876
460 -1.2296 -0.7823 -0.8400 A346 C734 -1.1324839 -1.3326418 -0.8951142 0.2341907
461 1.4120 0.3401 0.8367 A669 C241 1.3715588 0.6683398 0.7897889 0.1385307
462 -0.9857 -1.9139 0.9883 A432 C796 -1.2416195 -1.6918074 0.8854221 0.1936300
463 2.0201 -1.1335 0.9486 A819 C440 1.6077818 -0.7377123 0.8863730 0.2901110
464 1.4789 -1.7811 -0.9491 A790 C449 1.9465088 -1.1051116 -0.8303191 0.4207927
465 -2.7292 -1.2299 0.1134 A105 B14 -2.6854875 -1.3277500 0.0440500 0.0703042
466 1.7714 1.0370 0.9986 A677 C241 1.3715588 0.6683398 0.7897889 0.3257708
467 -1.6718 -2.0892 0.7371 A287 C796 -1.2416195 -1.6918074 0.8854221 0.3252984
468 0.3546 -1.0268 0.4065 A564 C584 0.3972782 -1.1889885 0.6190008 0.1391225
469 -1.4921 -0.8306 -0.9563 A314 C734 -1.1324839 -1.3326418 -0.8951142 0.3076146
470 -0.0975 1.0362 -0.2827 A385 C369 -0.6886751 1.0313763 -0.5793286 0.2975425
471 -0.6749 2.4205 0.8585 A128 C238 -1.2449921 1.9998908 0.7761860 0.3576717
472 -2.1999 -1.5101 -0.7439 A184 C885 -2.2205445 -1.6962525 -0.2547697 0.2319758
473 -1.7462 2.3686 0.3338 A30 C238 -1.2449921 1.9998908 0.7761860 0.4374344
474 -2.2442 -1.8519 0.4155 A146 C885 -2.2205445 -1.6962525 -0.2547697 0.2831909
475 -1.6038 1.2452 0.9995 A123 C238 -1.2449921 1.9998908 0.7761860 0.4456042
476 0.8203 -2.5723 0.7143 A822 C781 0.3633024 -2.3921816 0.7580945 0.2269702
477 -2.3361 -0.4799 -0.9230 A142 C735 -2.6744307 -0.1083882 -0.5495668 0.3610919
478 -1.8323 -1.1727 0.9845 A243 C739 -2.2003530 -0.5265358 0.8437657 0.3849838
479 0.9794 0.2564 -0.1571 A559 C401 1.0490181 -0.2102674 -0.2660895 0.2150917
480 -0.3035 0.9900 0.2641 A358 C284 0.0540627 1.1974407 0.5548169 0.2852401
481 -2.4279 1.2793 -0.6678 A33 C349 -1.9528751 1.6099130 -0.6652502 0.2693959
482 0.1168 -1.2793 -0.6987 A547 C595 0.2284234 -1.2015087 -0.5962098 0.0973016
483 2.3139 -1.3026 -0.7553 A856 C449 1.9465088 -1.1051116 -0.8303191 0.2132996
484 -0.1695 1.1002 -0.4622 A385 C369 -0.6886751 1.0313763 -0.5793286 0.2350425
485 -2.2592 -0.9424 0.8941 A155 C739 -2.2003530 -0.5265358 0.8437657 0.1750152
486 1.4935 -1.6012 -0.9819 A794 C449 1.9465088 -1.1051116 -0.8303191 0.3668927
487 -1.0212 -0.5923 0.5732 A336 C600 -0.7691625 -0.6921120 0.2002673 0.2415940
488 -1.3584 1.0863 0.9654 A147 C238 -1.2449921 1.9998908 0.7761860 0.4054042
489 0.5632 1.5804 0.9466 A447 C86 0.4002570 2.4565213 0.7038322 0.4272774
490 -1.1704 -2.5002 -0.6492 A442 C881 -0.7910667 -2.6797486 -0.2267249 0.3271190
491 -0.6283 1.4346 0.9010 A246 C284 0.0540627 1.1974407 0.5548169 0.4219017
492 -2.9767 -0.3382 -0.0907 A59 C735 -2.6744307 -0.1083882 -0.5495668 0.3303160
493 -0.9385 0.3856 0.1705 A295 C486 -1.1082059 0.3959832 0.5157773 0.1751221
494 1.0180 0.0346 -0.1917 A589 C401 1.0490181 -0.2102674 -0.2660895 0.1167583
495 1.0265 -2.8012 -0.1817 A831 C749 0.7737695 -2.3217321 -0.7810068 0.4438351
496 -2.8712 0.3054 0.4609 A37 C489 -2.3565929 1.0053767 0.6199063 0.4578634
497 0.1111 -0.9940 0.0181 A518 C595 0.2284234 -1.2015087 -0.5962098 0.3130473
498 1.4622 -0.7011 0.9257 A727 C440 1.6077818 -0.7377123 0.8863730 0.0738404
499 0.4035 -2.3249 0.9331 A736 C781 0.3633024 -2.3921816 0.7580945 0.0941615
500 1.1193 -0.5853 0.6760 A665 C440 1.6077818 -0.7377123 0.8863730 0.2837557
501 -2.9968 0.1381 0.0077 A41 C735 -2.6744307 -0.1083882 -0.5495668 0.3753748
502 1.3912 -0.2976 -0.8165 A687 C401 1.0490181 -0.2102674 -0.2660895 0.3266417
503 0.3283 -0.9662 -0.2014 A555 C595 0.2284234 -1.2015087 -0.5962098 0.2433317
504 -0.9958 0.5747 0.5263 A260 C486 -1.1082059 0.3959832 0.5157773 0.1005485
505 -2.4810 0.7241 0.8114 A53 C489 -2.3565929 1.0053767 0.6199063 0.1990592
506 2.9041 -0.7280 0.1098 A884 C175 2.7908779 -0.0658210 0.2694047 0.3116686
507 2.0338 0.9379 -0.9708 A707 C143 2.1955674 0.6403446 -0.8206917 0.2031437
508 -0.2291 -1.3259 0.7561 A516 C584 0.3972782 -1.1889885 0.6190008 0.3001296
509 -1.4013 -2.6400 -0.1488 A372 C881 -0.7910667 -2.6797486 -0.2267249 0.2426356
510 1.4736 -0.0790 0.8516 A682 C440 1.6077818 -0.7377123 0.8863730 0.2758890
511 -2.9766 -0.3728 -0.0168 A59 C735 -2.6744307 -0.1083882 -0.5495668 0.3664493
512 -0.1213 1.7275 0.9634 A326 C284 0.0540627 1.1974407 0.5548169 0.3713350
513 -1.1289 -1.0378 0.8845 A352 C796 -1.2416195 -1.6918074 0.8854221 0.2558830
514 0.8360 0.9574 -0.6845 A495 C248 0.7051029 1.0093226 -0.5739830 0.0977789
515 -0.5352 -1.8427 0.9967 A534 C796 -1.2416195 -1.6918074 0.8854221 0.3228634
516 0.4658 -1.4548 -0.8814 A650 C595 0.2284234 -1.2015087 -0.5962098 0.2586194
517 0.8797 -2.7792 -0.4032 A831 C749 0.7737695 -2.3217321 -0.7810068 0.3137351
518 -0.9202 2.0714 0.9638 A159 C238 -1.2449921 1.9998908 0.7761860 0.1946384
519 -0.0511 1.6720 -0.9449 A321 C136 -0.3880524 2.3702890 -0.7627342 0.4058024
520 -1.2389 -2.0691 0.9113 A400 C796 -1.2416195 -1.6918074 0.8854221 0.1352967
521 1.0510 1.1983 0.9138 A504 C241 1.3715588 0.6683398 0.7897889 0.3248434
522 -1.4102 2.2635 0.7452 A62 C238 -1.2449921 1.9998908 0.7761860 0.1532677
523 -2.3859 -1.3868 0.6504 A150 C739 -2.2003530 -0.5265358 0.8437657 0.4130590
524 -1.1930 2.6015 -0.5069 A73 C136 -0.3880524 2.3702890 -0.7627342 0.4306643
525 0.6419 -2.3753 -0.8876 A753 C749 0.7737695 -2.3217321 -0.7810068 0.0973435
526 2.6611 -0.7924 -0.6300 A867 C449 1.9465088 -1.1051116 -0.8303191 0.4092073
527 -1.5421 2.1730 -0.7472 A60 C349 -1.9528751 1.6099130 -0.6652502 0.3519373
528 -0.6956 -0.9904 0.6134 A421 C600 -0.7691625 -0.6921120 0.2002673 0.2616611
529 -2.0681 1.9627 -0.5249 A21 C349 -1.9528751 1.6099130 -0.6652502 0.2027874
530 1.2787 0.4898 -0.7760 A623 C143 2.1955674 0.6403446 -0.8206917 0.3707012
531 0.9907 1.8372 -0.9962 A510 C55 1.2437441 2.2381567 -0.6486543 0.3338488
532 -1.7474 -0.5758 0.9871 A219 C739 -2.2003530 -0.5265358 0.8437657 0.2151838
533 1.6102 -1.3687 0.9936 A759 C440 1.6077818 -0.7377123 0.8863730 0.2468777
534 -0.4159 -0.9144 0.0949 A460 C600 -0.7691625 -0.6921120 0.2002673 0.2269726
535 -2.2664 -1.5110 -0.6898 A184 C885 -2.2205445 -1.6962525 -0.2547697 0.2220461
536 0.0392 -2.9012 -0.4329 A738 C881 -0.7910667 -2.6797486 -0.2267249 0.4192977
537 -2.0110 -0.0632 -0.9999 A169 C561 -1.4573729 -0.0038881 -0.7758762 0.2789876
538 -2.2654 0.6002 0.9391 A78 C489 -2.3565929 1.0053767 0.6199063 0.2718544
539 -1.2112 1.4074 0.9897 A172 C238 -1.2449921 1.9998908 0.7761860 0.2799323
540 1.4018 2.6324 0.1869 A565 C55 1.2437441 2.2381567 -0.6486543 0.4626178
541 1.8356 1.5560 -0.9137 A655 C143 2.1955674 0.6403446 -0.8206917 0.4562104
542 -0.9543 0.4334 0.3064 A295 C486 -1.1082059 0.3959832 0.5157773 0.1335667
543 1.3417 -1.2673 0.9880 A759 C440 1.6077818 -0.7377123 0.8863730 0.2990988
544 0.1238 2.0232 0.9996 A354 C86 0.4002570 2.4565213 0.7038322 0.3351821
545 -2.8916 0.3848 0.3986 A37 C489 -2.3565929 1.0053767 0.6199063 0.4589634
546 0.7230 -1.2932 -0.8551 A650 C595 0.2284234 -1.2015087 -0.5962098 0.2817194
547 1.7773 -0.0647 0.9752 A744 C440 1.6077818 -0.7377123 0.8863730 0.3104525
548 -2.9704 0.2639 0.1884 A29 C735 -2.6744307 -0.1083882 -0.5495668 0.4687415
549 -0.2899 1.6172 0.9341 A275 C284 0.0540627 1.1974407 0.5548169 0.3810017
550 -1.5893 -0.5332 -0.9462 A223 C561 -1.4573729 -0.0038881 -0.7758762 0.2771876
551 0.9228 -1.8685 -0.9965 A764 C749 0.7737695 -2.3217321 -0.7810068 0.2725853
552 1.4734 -0.7364 -0.9357 A717 C449 1.9465088 -1.1051116 -0.8303191 0.3157337
553 -0.7369 -0.9890 0.6420 A414 C600 -0.7691625 -0.6921120 0.2002673 0.2569611
554 1.1002 -0.9681 -0.8451 A697 C449 1.9465088 -1.1051116 -0.8303191 0.3327004
555 -1.2650 -0.8080 0.8666 A282 C796 -1.2416195 -1.6918074 0.8854221 0.3086700
556 1.3658 0.2123 0.7864 A669 C241 1.3715588 0.6683398 0.7897889 0.1550625
557 -0.0426 2.9988 0.0416 A175 C86 0.4002570 2.4565213 0.7038322 0.5491226
558 1.9179 1.3814 0.9315 A688 C43 2.0166149 1.7911539 0.4786294 0.3204465
559 2.9610 0.3029 -0.2158 A837 C175 2.7908779 -0.0658210 0.2694047 0.3413493
560 0.8610 1.0333 -0.7556 A507 C248 0.7051029 1.0093226 -0.5739830 0.1204972
561 0.1268 -1.0099 0.1881 A539 C584 0.3972782 -1.1889885 0.6190008 0.2934892
562 0.9634 0.2999 0.1341 A572 C401 1.0490181 -0.2102674 -0.2660895 0.3319917
563 0.8544 1.8170 1.0000 A466 C86 0.4002570 2.4565213 0.7038322 0.4632774
564 1.5107 -1.9060 0.9018 A833 C586 2.0481682 -1.8887843 0.3179989 0.3794950
565 -2.9303 -0.1429 0.3579 A51 C735 -2.6744307 -0.1083882 -0.5495668 0.3992826
566 1.6285 0.7559 0.9788 A684 C241 1.3715588 0.6683398 0.7897889 0.1778375
567 -2.6279 -0.7164 -0.6900 A129 C735 -2.6744307 -0.1083882 -0.5495668 0.2649919
568 -2.7346 1.2031 0.1572 A12 C489 -2.3565929 1.0053767 0.6199063 0.3461456
569 -1.6274 2.3861 -0.4594 A43 C349 -1.9528751 1.6099130 -0.6652502 0.4358374
570 2.9095 -0.6207 0.2223 A876 C175 2.7908779 -0.0658210 0.2694047 0.2402019
571 1.7549 2.4086 -0.1983 A648 C55 1.2437441 2.2381567 -0.6486543 0.3773178
572 2.0231 -0.7529 -0.9873 A791 C449 1.9465088 -1.1051116 -0.8303191 0.1952612
573 -0.7844 2.4299 -0.8329 A153 C136 -0.3880524 2.3702890 -0.7627342 0.1753748
574 1.1828 -2.7098 0.2911 A868 C781 0.3633024 -2.3921816 0.7580945 0.5347035
575 -0.8695 1.7843 0.9999 A159 C238 -1.2449921 1.9998908 0.7761860 0.2715989
576 -1.1207 -2.7825 0.0230 A439 C881 -0.7910667 -2.6797486 -0.2267249 0.2273699
577 -0.0748 -1.0639 -0.3587 A526 C595 0.2284234 -1.2015087 -0.5962098 0.2261139
578 -1.2428 -2.7287 0.0575 A439 C881 -0.7910667 -2.6797486 -0.2267249 0.2616365
579 2.8513 -0.9218 -0.0828 A883 C175 2.7908779 -0.0658210 0.2694047 0.4228686
580 -1.1757 -1.3169 -0.9721 A395 C734 -1.1324839 -1.3326418 -0.8951142 0.0453146
581 -1.9833 -2.2366 -0.1461 A199 C885 -2.2205445 -1.6962525 -0.2547697 0.2954206
582 -1.0467 0.1224 0.3236 A292 C486 -1.1082059 0.3959832 0.5157773 0.1757555
583 0.0828 2.7407 -0.6705 A239 C136 -0.3880524 2.3702890 -0.7627342 0.3111658
584 2.3507 -1.8360 -0.1850 A896 C586 2.0481682 -1.8887843 0.3179989 0.2861050
585 1.7116 -2.4111 -0.2905 A887 C586 2.0481682 -1.8887843 0.3179989 0.4891276
586 2.7199 0.3429 0.6711 A830 C175 2.7908779 -0.0658210 0.2694047 0.2937981
587 2.5051 0.3796 -0.8456 A788 C143 2.1955674 0.6403446 -0.8206917 0.1983952
588 -1.8943 -1.6710 -0.8505 A302 C885 -2.2205445 -1.6962525 -0.2547697 0.3157424
589 -0.4655 -1.1613 -0.6628 A474 C595 0.2284234 -1.2015087 -0.5962098 0.2669074
590 -2.5056 1.0992 0.6768 A40 C489 -2.3565929 1.0053767 0.6199063 0.0999080
591 1.9829 -1.2689 -0.9352 A834 C449 1.9465088 -1.1051116 -0.8303191 0.1016869
592 -0.5048 -1.7034 0.9747 A534 C796 -1.2416195 -1.6918074 0.8854221 0.2792300
593 1.3969 -0.5298 0.8625 A716 C440 1.6077818 -0.7377123 0.8863730 0.1475557
594 -1.1043 0.3967 -0.5627 A254 C561 -1.4573729 -0.0038881 -0.7758762 0.3222790
595 0.2263 2.9654 -0.2266 A220 C86 0.4002570 2.4565213 0.7038322 0.5377559
596 -0.8658 -1.9585 -0.9900 A451 C734 -1.1324839 -1.3326418 -0.8951142 0.3291426
597 2.1895 -1.9851 -0.2952 A896 C586 2.0481682 -1.8887843 0.3179989 0.2836155
598 -0.0276 -1.5130 0.8735 A551 C584 0.3972782 -1.1889885 0.6190008 0.3344629
599 -1.1164 -1.8174 -0.9911 A402 C734 -1.1324839 -1.3326418 -0.8951142 0.1989426
600 -0.7666 -0.6520 -0.1128 A398 C600 -0.7691625 -0.6921120 0.2002673 0.1185806
601 -1.6434 2.2704 0.5964 A42 C238 -1.2449921 1.9998908 0.7761860 0.2829011
602 0.6884 -2.8245 -0.4208 A797 C749 0.7737695 -2.3217321 -0.7810068 0.3161147
603 -0.2810 -0.9981 -0.2690 A488 C595 0.2284234 -1.2015087 -0.5962098 0.3466806
604 1.0048 1.6324 -0.9965 A498 C55 1.2437441 2.2381567 -0.6486543 0.3975155
605 -1.0620 -2.6369 -0.5384 A442 C881 -0.7910667 -2.6797486 -0.2267249 0.2084857
606 -2.6396 1.0256 -0.5551 A31 C735 -2.6744307 -0.1083882 -0.5495668 0.3914507
607 0.9825 -0.2127 -0.1027 A600 C401 1.0490181 -0.2102674 -0.2660895 0.0774467
608 1.0656 1.0358 -0.8578 A537 C248 0.7051029 1.0093226 -0.5739830 0.2235972
609 -1.2023 0.2876 -0.6455 A253 C561 -1.4573729 -0.0038881 -0.7758762 0.2256457
610 -0.7470 0.8199 0.4544 A284 C486 -1.1082059 0.3959832 0.5157773 0.2821667
611 1.3243 1.8514 0.9611 A536 C43 2.0166149 1.7911539 0.4786294 0.4116772
612 0.7804 1.1002 -0.7590 A479 C248 0.7051029 1.0093226 -0.5739830 0.1170639
613 -1.2257 -1.7894 -0.9856 A368 C734 -1.1324839 -1.3326418 -0.8951142 0.2134867
614 -0.8393 2.5283 -0.7478 A110 C136 -0.3880524 2.3702890 -0.7627342 0.2080643
615 -2.7297 -0.8602 0.5069 A104 C739 -2.2003530 -0.5265358 0.8437657 0.3999590
616 -1.6496 1.3014 0.9949 A123 C238 -1.2449921 1.9998908 0.7761860 0.4406042
617 1.2892 -2.6961 0.1511 A868 C586 2.0481682 -1.8887843 0.3179989 0.5777276
618 2.2028 1.8696 0.4574 A728 C43 2.0166149 1.7911539 0.4786294 0.0952869
619 -0.3538 -1.7106 -0.9674 A524 C734 -1.1324839 -1.3326418 -0.8951142 0.4096426
620 0.8595 -0.7535 0.5153 A626 C584 0.3972782 -1.1889885 0.6190008 0.3338037
621 2.4636 1.4380 0.5227 A780 C43 2.0166149 1.7911539 0.4786294 0.2814032
622 -2.2334 1.8314 -0.4594 A8 C349 -1.9528751 1.6099130 -0.6652502 0.2359540
623 2.5190 -0.5243 0.8196 A845 C440 1.6077818 -0.7377123 0.8863730 0.3971345
624 2.2300 -1.8640 0.4224 A895 C586 2.0481682 -1.8887843 0.3179989 0.1036724
625 -0.8035 -0.7221 -0.3926 A384 C600 -0.7691625 -0.6921120 0.2002673 0.2190643
626 -1.9798 0.5833 0.9980 A117 C489 -2.3565929 1.0053767 0.6199063 0.3923211
627 -1.1517 -1.3108 -0.9669 A395 C734 -1.1324839 -1.3326418 -0.8951142 0.0376146
628 0.5289 1.0591 -0.5778 A448 C248 0.7051029 1.0093226 -0.5739830 0.0765991
629 -0.7507 0.7466 -0.3377 A290 C369 -0.6886751 1.0313763 -0.5793286 0.1961432
630 0.9194 -0.8520 0.6653 A668 C584 0.3972782 -1.1889885 0.6190008 0.3018032
631 -0.9681 -0.6976 0.5909 A357 C600 -0.7691625 -0.6921120 0.2002673 0.1983527
632 -0.5195 1.6644 0.9666 A232 C238 -1.2449921 1.9998908 0.7761860 0.4171323
633 -0.4785 -0.8951 -0.1723 A459 C600 -0.7691625 -0.6921120 0.2002673 0.2887393
634 -0.5117 -0.8666 0.1130 A437 C600 -0.7691625 -0.6921120 0.2002673 0.1730726
635 1.9110 -0.7038 0.9993 A772 C440 1.6077818 -0.7377123 0.8863730 0.1500192
636 -0.6131 1.7582 0.9904 A195 C238 -1.2449921 1.9998908 0.7761860 0.3625989
637 1.1752 -2.1111 -0.9093 A817 C749 0.7737695 -2.3217321 -0.7810068 0.2467853
638 2.3510 0.8604 -0.8640 A743 C143 2.1955674 0.6403446 -0.8206917 0.1395988
639 -0.9985 -2.7753 -0.3139 A515 C881 -0.7910667 -2.6797486 -0.2267249 0.1300533
640 -0.5238 1.7526 -0.9853 A238 C136 -0.3880524 2.3702890 -0.7627342 0.3253342
641 1.3442 -0.1036 -0.7584 A670 C401 1.0490181 -0.2102674 -0.2660895 0.2980533
642 -1.5723 -1.2375 1.0000 A243 C796 -1.2416195 -1.6918074 0.8854221 0.2998553
643 0.8112 1.1124 -0.7820 A507 C248 0.7051029 1.0093226 -0.5739830 0.1390639
644 1.2660 -0.2888 0.7127 A658 C440 1.6077818 -0.7377123 0.8863730 0.3214557
645 2.5982 1.3225 -0.4025 A768 C143 2.1955674 0.6403446 -0.8206917 0.5009932
646 1.6845 1.7248 -0.9116 A655 C55 1.2437441 2.2381567 -0.6486543 0.4056861
647 0.9550 1.3824 0.9475 A522 C241 1.3715588 0.6683398 0.7897889 0.4294434
648 0.2723 -1.5444 -0.9020 A615 C595 0.2284234 -1.2015087 -0.5962098 0.2308527
649 1.9756 -0.1891 -0.9999 A770 C449 1.9465088 -1.1051116 -0.8303191 0.3715613
650 -0.9735 -1.1867 -0.8853 A412 C734 -1.1324839 -1.3326418 -0.8951142 0.1049133
651 -1.3334 2.6861 -0.0478 A35 B7 -1.3872333 2.6403000 -0.1728167 0.0748833
652 1.9285 1.2034 0.9620 A713 C43 2.0166149 1.7911539 0.4786294 0.3864131
653 2.4124 -0.2476 0.9052 A818 C175 2.7908779 -0.0658210 0.2694047 0.3986841
654 2.0804 1.8932 -0.5825 A708 C43 2.0166149 1.7911539 0.4786294 0.4089869
655 -1.2970 2.0146 -0.9182 A108 C349 -1.9528751 1.6099130 -0.6652502 0.4378373
656 -1.5459 -2.0455 0.8258 A287 C796 -1.2416195 -1.6918074 0.8854221 0.2391984
657 -0.3568 1.2439 0.7083 A280 C284 0.0540627 1.1974407 0.5548169 0.2036017
658 -0.8760 -1.6343 0.9893 A433 C796 -1.2416195 -1.6918074 0.8854221 0.1756683
659 -0.5353 1.8551 -0.9976 A203 C136 -0.3880524 2.3702890 -0.7627342 0.2991008
660 0.3546 -0.9397 -0.0934 A540 C595 0.2284234 -1.2015087 -0.5962098 0.2969317
661 -1.6496 -2.4492 0.3033 A315 C881 -0.7910667 -2.6797486 -0.2267249 0.5397023
662 -2.5656 0.1317 0.8224 A81 C739 -2.2003530 -0.5265358 0.8437657 0.3482828
663 1.8608 1.3005 0.9628 A645 C43 2.0166149 1.7911539 0.4786294 0.3768798
664 2.9249 0.3505 0.3248 A841 C175 2.7908779 -0.0658210 0.2694047 0.2019128
665 0.6457 -1.6575 0.9752 A696 C584 0.3972782 -1.1889885 0.6190008 0.3577108
666 0.1368 0.9935 0.0759 A422 C284 0.0540627 1.1974407 0.5548169 0.2551983
667 0.8900 -2.4050 -0.8255 A786 C749 0.7737695 -2.3217321 -0.7810068 0.0813305
668 -1.4920 1.2307 -0.9978 A149 C349 -1.9528751 1.6099130 -0.6652502 0.3908793
669 -1.7534 -0.5223 0.9854 A219 C739 -2.2003530 -0.5265358 0.8437657 0.1976077
670 1.2311 -0.0892 0.6432 A658 C241 1.3715588 0.6683398 0.7897889 0.3481958
671 0.3803 -1.0225 0.4166 A564 C584 0.3972782 -1.1889885 0.6190008 0.1286225
672 1.0541 0.5678 0.5964 A577 C241 1.3715588 0.6683398 0.7897889 0.2037958
673 2.0403 0.1561 -0.9989 A741 C143 2.1955674 0.6403446 -0.8206917 0.2725734
674 0.9612 1.9183 -0.9893 A458 C55 1.2437441 2.2381567 -0.6486543 0.3143488
675 -1.0499 0.1010 0.3262 A292 C486 -1.1082059 0.3959832 0.5157773 0.1809555
676 0.1230 -1.7550 0.9706 A644 C781 0.3633024 -2.3921816 0.7580945 0.3633298
677 -0.6643 1.5223 -0.9408 A208 C369 -0.6886751 1.0313763 -0.5793286 0.2922568
678 0.8273 0.9881 0.7029 A514 C241 1.3715588 0.6683398 0.7897889 0.3169693
679 2.0352 -2.0150 0.5037 A891 C586 2.0481682 -1.8887843 0.3179989 0.1082950
680 -2.6951 -0.4503 0.6808 A77 C739 -2.2003530 -0.5265358 0.8437657 0.2446495
681 2.1908 1.8223 -0.5274 A740 C43 2.0166149 1.7911539 0.4786294 0.4037869
682 -2.4852 -0.2717 -0.8660 A98 C735 -2.6744307 -0.1083882 -0.5495668 0.2229919
683 1.6783 0.0772 -0.9474 A689 C143 2.1955674 0.6403446 -0.8206917 0.4023734
684 0.6254 0.8114 0.2197 A496 C248 0.7051029 1.0093226 -0.5739830 0.3571028
685 -1.5871 0.1607 -0.9144 A216 C561 -1.4573729 -0.0038881 -0.7758762 0.1442797
686 -0.8377 -0.9972 0.7164 A414 C600 -0.7691625 -0.6921120 0.2002673 0.2965861
687 1.1415 1.3608 -0.9746 A548 C248 0.7051029 1.0093226 -0.5739830 0.3961639
688 0.9437 0.4865 0.3457 A543 C241 1.3715588 0.6683398 0.7897889 0.3512625
689 0.0768 1.0318 0.2609 A399 C284 0.0540627 1.1974407 0.5548169 0.1607650
690 -1.4622 -0.0679 -0.8441 A218 C561 -1.4573729 -0.0038881 -0.7758762 0.0456876
691 -1.1216 0.3775 -0.5772 A254 C561 -1.4573729 -0.0038881 -0.7758762 0.3052790
692 -0.9478 2.3429 -0.8496 A132 C136 -0.3880524 2.3702890 -0.7627342 0.2246675
693 -1.0487 -2.0630 -0.9493 A451 C734 -1.1324839 -1.3326418 -0.8951142 0.2894426
694 0.0636 1.0094 -0.1503 A406 C284 0.0540627 1.1974407 0.5548169 0.3008983
695 0.7048 -1.0183 0.6481 A652 C584 0.3972782 -1.1889885 0.6190008 0.1691032
696 1.1708 -1.2149 0.9498 A759 C440 1.6077818 -0.7377123 0.8863730 0.3258655
697 0.4922 0.8973 0.2153 A475 C284 0.0540627 1.1974407 0.5548169 0.3592650
698 0.8516 0.8151 -0.5707 A529 C248 0.7051029 1.0093226 -0.5739830 0.1146676
699 0.6533 -1.2453 0.8047 A672 C584 0.3972782 -1.1889885 0.6190008 0.1660108
700 0.0460 -1.0568 0.3351 A520 C584 0.3972782 -1.1889885 0.6190008 0.2557892
701 -2.7755 0.9662 -0.3443 A16 C735 -2.6744307 -0.1083882 -0.5495668 0.4603081
702 -1.4821 0.3120 0.8743 A221 C486 -1.1082059 0.3959832 0.5157773 0.2721333
703 2.1590 1.9691 0.3870 A728 C43 2.0166149 1.7911539 0.4786294 0.1373202
704 -1.9778 1.6304 0.8263 A82 C238 -1.2449921 1.9998908 0.7761860 0.3841376
705 -2.1780 1.8323 0.5328 A14 C489 -2.3565929 1.0053767 0.6199063 0.3642075
706 -1.4985 -1.9168 -0.9014 A302 C734 -1.1324839 -1.3326418 -0.8951142 0.3188200
707 -1.1966 -0.9639 0.8861 A352 C796 -1.2416195 -1.6918074 0.8854221 0.2578683
708 -0.8079 -0.5996 -0.1102 A377 C600 -0.7691625 -0.6921120 0.2002673 0.1472389
709 -0.6987 -0.7359 0.1709 A403 C600 -0.7691625 -0.6921120 0.2002673 0.0478726
710 0.4899 -1.9888 -0.9988 A673 C749 0.7737695 -2.3217321 -0.7810068 0.2781983
711 1.1551 -2.7246 -0.2822 A831 C749 0.7737695 -2.3217321 -0.7810068 0.4276684
712 1.2965 -1.0996 -0.9539 A732 C449 1.9465088 -1.1051116 -0.8303191 0.2597004
713 -1.2545 1.2382 0.9714 A161 C238 -1.2449921 1.9998908 0.7761860 0.3221376
714 1.1090 -0.6640 -0.7068 A666 C401 1.0490181 -0.2102674 -0.2660895 0.3181417
715 -0.4117 -1.0055 0.4070 A471 C600 -0.7691625 -0.6921120 0.2002673 0.2925277
716 -0.9037 -2.0041 -0.9801 A451 C734 -1.1324839 -1.3326418 -0.8951142 0.3284093
717 -0.2731 1.7720 0.9783 A248 C284 0.0540627 1.1974407 0.5548169 0.4417350
718 -1.3254 -0.4557 -0.8012 A304 C561 -1.4573729 -0.0038881 -0.7758762 0.2030362
719 0.4671 0.9633 -0.3690 A469 C248 0.7051029 1.0093226 -0.5739830 0.1630028
720 -0.6360 1.6464 0.9720 A232 C238 -1.2449921 1.9998908 0.7761860 0.3860989
721 2.8007 1.0313 -0.1752 A807 C175 2.7908779 -0.0658210 0.2694047 0.5171826
722 -0.7577 1.2705 0.8537 A246 C284 0.0540627 1.1974407 0.5548169 0.3945684
723 0.5144 0.9067 0.2883 A470 C284 0.0540627 1.1974407 0.5548169 0.3391983
724 -0.4076 1.0078 0.4081 A333 C284 0.0540627 1.1974407 0.5548169 0.2660067
725 0.9441 -0.3729 0.1730 A602 C401 1.0490181 -0.2102674 -0.2660895 0.2355467
726 -0.5077 -2.1905 -0.9686 A542 C881 -0.7910667 -2.6797486 -0.2267249 0.5048301
727 -0.7651 1.1235 0.7678 A245 C284 0.0540627 1.1974407 0.5548169 0.3686955
728 0.8193 1.1242 -0.7932 A507 C248 0.7051029 1.0093226 -0.5739830 0.1494305
729 1.7522 2.1212 -0.6600 A641 C55 1.2437441 2.2381567 -0.6486543 0.2122528
730 -0.5695 -1.3865 0.8654 A463 C796 -1.2416195 -1.6918074 0.8854221 0.3324830
731 -0.8924 -2.6033 -0.6591 A497 C881 -0.7910667 -2.6797486 -0.2267249 0.2033857
732 -1.1690 2.6181 0.4979 A48 C238 -1.2449921 1.9998908 0.7761860 0.3241624
733 2.5415 0.2926 -0.8296 A788 C143 2.1955674 0.6403446 -0.8206917 0.2341952
734 0.5176 -2.8669 0.4074 A809 C781 0.3633024 -2.3921816 0.7580945 0.3265702
735 2.4278 0.9802 -0.7860 A765 C143 2.1955674 0.6403446 -0.8206917 0.2022599
736 -0.6331 1.8907 -1.0000 A203 C136 -0.3880524 2.3702890 -0.7627342 0.3206342
737 -0.2547 -0.9730 -0.1073 A484 C600 -0.7691625 -0.6921120 0.2002673 0.3676393
738 0.4394 2.7139 0.6622 A327 C86 0.4002570 2.4565213 0.7038322 0.1127179
739 -0.9718 -2.5446 -0.6899 A497 C881 -0.7910667 -2.6797486 -0.2267249 0.2596857
740 -1.2822 -2.0245 -0.9181 A402 C734 -1.1324839 -1.3326418 -0.8951142 0.2881867
741 0.3715 1.0557 0.4735 A431 C284 0.0540627 1.1974407 0.5548169 0.1801650
742 -1.0551 2.4294 -0.7611 A110 C136 -0.3880524 2.3702890 -0.7627342 0.2425976
743 -1.8517 2.2335 -0.4333 A24 C349 -1.9528751 1.6099130 -0.6652502 0.3189041
744 -2.0391 1.1568 0.9388 A97 C489 -2.3565929 1.0053767 0.6199063 0.2626033
745 1.7183 0.4751 0.9761 A723 C241 1.3715588 0.6683398 0.7897889 0.2420973
746 2.6833 1.1678 -0.3765 A795 C143 2.1955674 0.6403446 -0.8206917 0.4864599
747 -2.2414 0.6297 -0.9446 A86 C349 -1.9528751 1.6099130 -0.6652502 0.5160292
748 -0.9681 -0.6112 -0.5184 A355 C600 -0.7691625 -0.6921120 0.2002673 0.3328389
749 0.3047 0.9525 -0.0106 A428 C248 0.7051029 1.0093226 -0.5739830 0.3402028
750 -1.4524 2.4939 0.4636 A52 C238 -1.2449921 1.9998908 0.7761860 0.3380011
751 0.4293 -0.9396 -0.2549 A583 C595 0.2284234 -1.2015087 -0.5962098 0.2680317
752 0.1198 -2.4846 0.8731 A718 C781 0.3633024 -2.3921816 0.7580945 0.1503088
753 -0.2947 0.9688 0.1587 A358 C284 0.0540627 1.1974407 0.5548169 0.3245067
754 0.3113 2.8561 -0.4878 A220 C136 -0.3880524 2.3702890 -0.7627342 0.4866992
755 0.2665 -1.7114 0.9634 A675 C781 0.3633024 -2.3921816 0.7580945 0.3276298
756 1.8999 -0.1016 -0.9952 A729 C143 2.1955674 0.6403446 -0.8206917 0.4040401
757 2.4412 -1.7303 -0.1241 A894 C586 2.0481682 -1.8887843 0.3179989 0.3312050
758 0.5551 2.3177 -0.9237 A396 C55 1.2437441 2.2381567 -0.6486543 0.3477444
759 -1.4341 -0.7982 -0.9334 A314 C734 -1.1324839 -1.3326418 -0.8951142 0.2914479
760 2.7886 0.5819 -0.5289 A816 C143 2.1955674 0.6403446 -0.8206917 0.3144230
761 -0.5464 2.5419 0.8000 A160 C86 0.4002570 2.4565213 0.7038322 0.3760678
762 -1.0242 0.4211 0.4507 A276 C486 -1.1082059 0.3959832 0.5157773 0.0580667
763 -1.0319 0.1283 -0.2795 A312 C561 -1.4573729 -0.0038881 -0.7758762 0.3513457
764 -1.5272 -2.5220 -0.3173 A349 C881 -0.7910667 -2.6797486 -0.2267249 0.3281523
765 1.5020 -0.5839 -0.9214 A714 C449 1.9465088 -1.1051116 -0.8303191 0.3522671
766 -0.4849 -0.9703 -0.4028 A446 C595 0.2284234 -1.2015087 -0.5962098 0.3793139
767 -1.4488 1.1390 0.9876 A147 C238 -1.2449921 1.9998908 0.7761860 0.4253709
768 -0.5219 -2.7379 -0.6167 A554 C881 -0.7910667 -2.6797486 -0.2267249 0.2390977
769 -2.2479 1.5496 0.6832 A44 C489 -2.3565929 1.0053767 0.6199063 0.2387366
770 -0.1817 -1.5395 -0.8931 A546 C595 0.2284234 -1.2015087 -0.5962098 0.3483350
771 -1.4958 -0.1435 -0.8676 A218 C561 -1.4573729 -0.0038881 -0.7758762 0.0899210
772 0.6658 2.9187 -0.1127 A408 C86 0.4002570 2.4565213 0.7038322 0.5147513
773 -0.5298 -1.0959 0.6223 A465 C584 0.3972782 -1.1889885 0.6190008 0.3411553
774 0.3841 -0.9252 -0.0589 A567 C584 0.3972782 -1.1889885 0.6190008 0.3182892
775 -0.7664 1.1051 -0.7555 A267 C369 -0.6886751 1.0313763 -0.5793286 0.1092067
776 -1.0188 0.9669 -0.8034 A226 C369 -0.6886751 1.0313763 -0.5793286 0.2062242
777 -2.5464 -0.2417 -0.8300 A98 C735 -2.6744307 -0.1083882 -0.5495668 0.1805919
778 -2.0807 0.5922 -0.9866 A139 C561 -1.4573729 -0.0038881 -0.7758762 0.4767130
779 -1.1831 -1.8785 0.9755 A432 C796 -1.2416195 -1.6918074 0.8854221 0.1117634
780 -0.1901 2.5961 -0.7977 A187 C136 -0.3880524 2.3702890 -0.7627342 0.1529097
781 -1.4678 -1.2009 0.9946 A320 C796 -1.2416195 -1.6918074 0.8854221 0.2754219
782 0.4993 1.8356 -0.9952 A378 C248 0.7051029 1.0093226 -0.5739830 0.4844324
783 -1.6490 -2.4871 0.1774 A315 C881 -0.7910667 -2.6797486 -0.2267249 0.4849023
784 -1.9702 -0.4788 -0.9996 A178 C561 -1.4573729 -0.0038881 -0.7758762 0.4038210
785 2.6951 -1.3175 -0.0135 A893 C586 2.0481682 -1.8887843 0.3179989 0.5165716
786 -1.1411 -2.2151 -0.8707 A457 C734 -1.1324839 -1.3326418 -0.8951142 0.3051629
787 -0.5181 0.9490 -0.3949 A305 C369 -0.6886751 1.0313763 -0.5793286 0.1457933
788 0.2645 -1.7999 0.9835 A675 C781 0.3633024 -2.3921816 0.7580945 0.3054965
789 -2.1224 -1.3869 0.8446 A189 C739 -2.2003530 -0.5265358 0.8437657 0.3130505
790 -0.0835 1.0285 -0.2503 A385 C369 -0.6886751 1.0313763 -0.5793286 0.3123600
791 0.2964 -1.0301 0.3722 A564 C584 0.3972782 -1.1889885 0.6190008 0.1688558
792 -2.9697 -0.0636 -0.2415 A47 C735 -2.6744307 -0.1083882 -0.5495668 0.2160415
793 1.5766 -1.2649 -0.9998 A760 C449 1.9465088 -1.1051116 -0.8303191 0.2330594
794 1.2197 -0.6145 -0.7732 A693 C401 1.0490181 -0.2102674 -0.2660895 0.3606750
795 1.1567 2.6821 -0.3899 A473 C55 1.2437441 2.2381567 -0.6486543 0.2632472
796 -1.1813 1.0099 0.8951 A196 C486 -1.1082059 0.3959832 0.5157773 0.3554445
797 1.0089 -0.1229 0.1800 A601 C401 1.0490181 -0.2102674 -0.2660895 0.1911917
798 -1.5521 1.9272 0.8803 A107 C238 -1.2449921 1.9998908 0.7761860 0.1613042
799 -0.9142 0.5823 -0.4009 A270 C369 -0.6886751 1.0313763 -0.5793286 0.2843432
800 -0.9733 -1.6410 0.9958 A433 C796 -1.2416195 -1.6918074 0.8854221 0.1431683
801 0.6134 -0.9412 -0.4813 A614 C595 0.2284234 -1.2015087 -0.5962098 0.2533984
802 -0.4919 -1.6794 0.9682 A534 C796 -1.2416195 -1.6918074 0.8854221 0.2816349
803 1.9416 1.1083 -0.9718 A707 C143 2.1955674 0.6403446 -0.8206917 0.2910104
804 -1.6350 1.4454 0.9833 A119 C238 -1.2449921 1.9998908 0.7761860 0.3838709
805 -0.7363 1.3667 0.8942 A246 C238 -1.2449921 1.9998908 0.7761860 0.4199656
806 -0.1194 1.0021 0.1349 A381 C284 0.0540627 1.1974407 0.5548169 0.2629067
807 0.0281 -2.8405 -0.5416 A738 C881 -0.7910667 -2.6797486 -0.2267249 0.4315977
808 -1.9791 -0.1920 0.9999 A164 C739 -2.2003530 -0.5265358 0.8437657 0.2373077
809 -0.7417 -1.5029 0.9460 A463 C796 -1.2416195 -1.6918074 0.8854221 0.2498016
810 1.6173 1.3443 0.9947 A645 C241 1.3715588 0.6683398 0.7897889 0.3755375
811 1.0458 -2.8054 -0.1094 A839 B16 1.2100800 -2.7380200 -0.0992800 0.0805933
812 2.1720 -1.9280 -0.4270 A896 C586 2.0481682 -1.8887843 0.3179989 0.3026821
813 2.1857 0.1329 0.9818 A767 C175 2.7908779 -0.0658210 0.2694047 0.5054314
814 1.9111 1.0016 0.9875 A715 C241 1.3715588 0.6683398 0.7897889 0.3568375
815 0.8099 1.2015 -0.8345 A502 C248 0.7051029 1.0093226 -0.5739830 0.1858305
816 1.2329 2.4298 0.6890 A528 C86 0.4002570 2.4565213 0.7038322 0.2913988
817 0.6089 0.7987 -0.0926 A476 C248 0.7051029 1.0093226 -0.5739830 0.2627361
818 0.1751 1.9570 0.9994 A354 C86 0.4002570 2.4565213 0.7038322 0.3400821
819 1.6527 -2.1793 0.6780 A860 C586 2.0481682 -1.8887843 0.3179989 0.3486617
820 -1.5662 -0.1200 0.9032 A202 C739 -2.2003530 -0.5265358 0.8437657 0.3667077
821 -0.0892 1.0882 -0.4187 A385 C369 -0.6886751 1.0313763 -0.5793286 0.2723091
822 -2.5678 -1.5251 -0.1635 A126 C885 -2.2205445 -1.6962525 -0.2547697 0.2032259
823 -1.3904 -0.9561 -0.9499 A314 C734 -1.1324839 -1.3326418 -0.8951142 0.2297479
824 2.3203 -1.9015 0.0079 A900 C586 2.0481682 -1.8887843 0.3179989 0.1983155
825 -0.6710 -0.8114 -0.3208 A411 C600 -0.7691625 -0.6921120 0.2002673 0.2461726
826 1.2089 2.2402 0.8380 A493 C86 0.4002570 2.4565213 0.7038322 0.3863774
827 1.1246 -1.7406 0.9974 A777 C440 1.6077818 -0.7377123 0.8863730 0.5323655
828 -0.1536 -2.3391 -0.9389 A635 C749 0.7737695 -2.3217321 -0.7810068 0.3675435
829 1.8496 -0.3184 0.9924 A779 C440 1.6077818 -0.7377123 0.8863730 0.2557192
830 -0.7143 2.2384 -0.9369 A153 C136 -0.3880524 2.3702890 -0.7627342 0.2107675
831 1.8060 -1.3900 -0.9603 A805 C449 1.9465088 -1.1051116 -0.8303191 0.1851260
832 1.8971 0.2357 -0.9961 A741 C143 2.1955674 0.6403446 -0.8206917 0.2928401
833 -2.8836 0.5040 0.3743 A28 C489 -2.3565929 1.0053767 0.6199063 0.4246634
834 0.9361 1.9440 0.9875 A449 C86 0.4002570 2.4565213 0.7038322 0.4440107
835 2.9772 0.3597 -0.0489 A837 C175 2.7908779 -0.0658210 0.2694047 0.3100493
836 -0.3690 0.9739 -0.2851 A347 C369 -0.6886751 1.0313763 -0.5793286 0.2237933
837 -0.3794 -0.9701 -0.2857 A459 C595 0.2284234 -1.2015087 -0.5962098 0.3832473
838 -1.3547 -2.5256 -0.5001 A382 C881 -0.7910667 -2.6797486 -0.2267249 0.3303857
839 -2.3754 -1.3752 0.6673 A150 C739 -2.2003530 -0.5265358 0.8437657 0.4000590
840 2.5685 0.2401 0.8149 A802 C175 2.7908779 -0.0658210 0.2694047 0.3579314
841 -2.4370 -1.3451 -0.6212 A165 C885 -2.2205445 -1.6962525 -0.2547697 0.3113461
842 0.1460 1.1172 -0.4871 A410 C248 0.7051029 1.0093226 -0.5739830 0.2512878
843 -0.4237 1.1652 -0.6497 A343 C369 -0.6886751 1.0313763 -0.5793286 0.1563901
844 0.9056 0.5188 -0.2924 A527 C401 1.0490181 -0.2102674 -0.2660895 0.2995987
845 -2.0585 -0.7940 0.9785 A155 C739 -2.2003530 -0.5265358 0.8437657 0.1813505
846 -1.0599 -2.5859 -0.6070 A442 C881 -0.7910667 -2.6797486 -0.2267249 0.2476523
847 0.9710 -2.6713 0.5390 A851 C781 0.3633024 -2.3921816 0.7580945 0.3686369
848 1.4699 -0.3638 -0.8741 A687 C401 1.0490181 -0.2102674 -0.2660895 0.3941417
849 1.0251 0.4942 -0.5069 A550 C248 0.7051029 1.0093226 -0.5739830 0.3007342
850 -2.5113 1.6410 0.0151 A2 C349 -1.9528751 1.6099130 -0.6652502 0.4232874
851 -2.8765 0.8519 -0.0099 A20 C489 -2.3565929 1.0053767 0.6199063 0.4343967
852 -0.8972 2.8079 0.3188 A70 B6 -1.0433500 2.7931200 0.1652600 0.1048233
853 0.5761 -1.5471 0.9371 A696 C584 0.3972782 -1.1889885 0.6190008 0.2850108
854 1.8111 1.8984 0.7816 A667 C43 2.0166149 1.7911539 0.4786294 0.2052439
855 -2.4783 -1.6828 -0.0936 A126 C885 -2.2205445 -1.6962525 -0.2547697 0.1441259
856 -0.6546 1.3814 -0.8819 A230 C369 -0.6886751 1.0313763 -0.5793286 0.2288901
857 -1.5165 0.5741 -0.9256 A185 C561 -1.4573729 -0.0038881 -0.7758762 0.2622797
858 1.1226 1.9108 -0.9764 A510 C55 1.2437441 2.2381567 -0.6486543 0.2587488
859 2.5162 1.3555 -0.5134 A768 C143 2.1955674 0.6403446 -0.8206917 0.4476932
860 1.1841 1.8622 0.9784 A536 C43 2.0166149 1.7911539 0.4786294 0.4677772
861 -0.9887 -0.3821 -0.3410 A335 C600 -0.7691625 -0.6921120 0.2002673 0.3569389
862 -1.4485 0.6638 0.9136 A181 C486 -1.1082059 0.3959832 0.5157773 0.3353112
863 -0.5073 2.9558 0.0443 A120 B5 -0.5508600 2.9469000 -0.0122000 0.0363200
864 -0.4683 0.9760 -0.3979 A347 C369 -0.6886751 1.0313763 -0.5793286 0.1523933
865 1.2372 -0.6451 -0.7964 A693 C401 1.0490181 -0.2102674 -0.2660895 0.3844417
866 -0.1517 -1.0537 0.3534 A506 C584 0.3972782 -1.1889885 0.6190008 0.3166225
867 -1.5589 1.1647 0.9985 A147 C489 -2.3565929 1.0053767 0.6199063 0.4452033
868 -0.9122 0.8726 0.6752 A261 C486 -1.1082059 0.3959832 0.5157773 0.2773485
869 -2.6813 -0.3594 0.7089 A77 C739 -2.2003530 -0.5265358 0.8437657 0.2609828
870 -0.3608 -1.0169 0.3896 A477 C600 -0.7691625 -0.6921120 0.2002673 0.3074944
871 -0.5846 -0.8115 0.0195 A420 C600 -0.7691625 -0.6921120 0.2002673 0.1615726
872 0.7685 -1.0250 0.6951 A652 C584 0.3972782 -1.1889885 0.6190008 0.2037698
873 2.3031 -0.1196 0.9520 A818 C175 2.7908779 -0.0658210 0.2694047 0.4080507
874 -2.2813 1.7749 0.4552 A9 C489 -2.3565929 1.0053767 0.6199063 0.3365075
875 1.0290 -2.4125 -0.7824 A827 C749 0.7737695 -2.3217321 -0.7810068 0.1157972
876 1.7745 -0.3419 -0.9812 A750 C449 1.9465088 -1.1051116 -0.8303191 0.3620338
877 -1.9880 0.3898 0.9997 A130 C739 -2.2003530 -0.5265358 0.8437657 0.4282077
878 -2.3828 -1.6542 -0.4344 A168 C885 -2.2205445 -1.6962525 -0.2547697 0.1279794
879 1.3378 0.6163 0.8498 A612 C241 1.3715588 0.6683398 0.7897889 0.0486032
880 0.4932 2.7237 -0.6404 A356 C55 1.2437441 2.2381567 -0.6486543 0.4147806
881 -0.7942 2.5691 -0.7247 A110 C136 -0.3880524 2.3702890 -0.7627342 0.2143309
882 0.9089 2.6941 0.5374 A482 C86 0.4002570 2.4565213 0.7038322 0.3042179
883 2.3570 0.8478 0.8632 A745 C241 1.3715588 0.6683398 0.7897889 0.4127708
884 -1.4659 2.5197 -0.4033 A43 C136 -0.3880524 2.3702890 -0.7627342 0.5288976
885 1.5652 -0.0004 0.9005 A706 C440 1.6077818 -0.7377123 0.8863730 0.2646737
886 2.0064 0.2239 -0.9998 A741 C143 2.1955674 0.6403446 -0.8206917 0.2615734
887 -0.0070 2.9967 -0.0816 A175 C136 -0.3880524 2.3702890 -0.7627342 0.5628658
888 0.0647 1.1972 -0.5986 A373 C248 0.7051029 1.0093226 -0.5739830 0.2842991
889 0.7030 -0.9188 -0.5378 A614 C595 0.2284234 -1.2015087 -0.5962098 0.2718984
890 -0.6098 1.1700 0.7326 A278 C284 0.0540627 1.1974407 0.5548169 0.2896955
891 0.8216 -0.6134 0.2238 A610 C401 1.0490181 -0.2102674 -0.2660895 0.3734801
892 -0.3041 -1.6957 -0.9608 A524 C734 -1.1324839 -1.3326418 -0.8951142 0.4190426
893 -0.1743 -1.2538 -0.6790 A521 C595 0.2284234 -1.2015087 -0.5962098 0.1792683
894 -0.7165 -1.7330 0.9922 A433 C796 -1.2416195 -1.6918074 0.8854221 0.2243634
895 -2.2476 -1.9798 -0.0977 A176 C885 -2.2205445 -1.6962525 -0.2547697 0.1558909
896 1.3934 0.7061 0.8990 A612 C241 1.3715588 0.6683398 0.7897889 0.0562708
897 2.8290 0.3350 -0.5288 A816 C143 2.1955674 0.6403446 -0.8206917 0.4102230
898 1.3089 1.6033 0.9976 A553 C241 1.3715588 0.6683398 0.7897889 0.4018100
899 -0.2807 -2.9525 0.2592 A691 C881 -0.7910667 -2.6797486 -0.2267249 0.4230143
900 1.4220 -1.9844 0.8974 A833 C586 2.0481682 -1.8887843 0.3179989 0.4337284
901 -0.3375 1.2580 -0.7166 A343 C369 -0.6886751 1.0313763 -0.5793286 0.2383568
902 -2.4900 0.1258 -0.8699 A89 C735 -2.6744307 -0.1083882 -0.5495668 0.2463174
903 2.9322 -0.4674 0.2461 A872 C175 2.7908779 -0.0658210 0.2694047 0.1887353
904 -2.1095 -1.0556 -0.9334 A192 C734 -1.1324839 -1.3326418 -0.8951142 0.4307812
905 1.3612 -2.6563 0.1741 A868 C586 2.0481682 -1.8887843 0.3179989 0.5327943
906 -2.1721 0.7032 0.9591 A92 C489 -2.3565929 1.0053767 0.6199063 0.2752877
907 1.1076 -2.5960 0.5690 A851 C781 0.3633024 -2.3921816 0.7580945 0.3790702
908 0.5589 1.2429 0.7707 A464 C284 0.0540627 1.1974407 0.5548169 0.2553933
909 1.6700 1.7644 0.9031 A664 C43 2.0166149 1.7911539 0.4786294 0.2659465
910 1.9226 0.0066 0.9970 A773 C440 1.6077818 -0.7377123 0.8863730 0.3899192
911 -0.5962 -1.4437 0.8990 A463 C796 -1.2416195 -1.6918074 0.8854221 0.3023683
912 1.1922 2.7029 0.2992 A472 C86 0.4002570 2.4565213 0.7038322 0.4809846
913 1.0478 -0.5227 0.5591 A640 C440 1.6077818 -0.7377123 0.8863730 0.3674223
914 1.3979 -1.4941 -0.9989 A751 C449 1.9465088 -1.1051116 -0.8303191 0.3687260
915 2.2293 1.4303 -0.7611 A725 C143 2.1955674 0.6403446 -0.8206917 0.2944266
916 2.4352 -0.7026 -0.8452 A824 C449 1.9465088 -1.1051116 -0.8303191 0.3020279
917 -0.7975 -0.6624 0.2685 A383 C600 -0.7691625 -0.6921120 0.2002673 0.0420940
918 -2.2584 1.1719 0.8389 A67 C489 -2.3565929 1.0053767 0.6199063 0.1612366
919 -1.5832 -1.3228 0.9980 A243 C796 -1.2416195 -1.6918074 0.8854221 0.2743886
920 -2.0306 -2.2065 0.0511 A176 C885 -2.2205445 -1.6962525 -0.2547697 0.3353539
921 -0.5566 -1.1836 -0.7218 A474 C734 -1.1324839 -1.3326418 -0.8951142 0.2994133
922 1.9243 -1.9852 -0.6443 A864 C449 1.9465088 -1.1051116 -0.8303191 0.3627721
923 -0.5890 2.4180 0.8725 A160 C238 -1.2449921 1.9998908 0.7761860 0.3901384
924 -0.2397 1.0353 -0.3485 A363 C369 -0.6886751 1.0313763 -0.5793286 0.2279091
925 -1.1654 0.0131 0.5510 A277 C486 -1.1082059 0.3959832 0.5157773 0.1584333
926 -0.0025 1.3894 0.7920 A338 C284 0.0540627 1.1974407 0.5548169 0.1619017
927 2.0881 0.5562 0.9870 A752 C241 1.3715588 0.6683398 0.7897889 0.3419640
928 2.4462 -1.5699 -0.4219 A880 C449 1.9465088 -1.1051116 -0.8303191 0.4576329
929 -1.4430 0.4811 -0.8779 A212 C561 -1.4573729 -0.0038881 -0.7758762 0.2004616
930 -0.0901 0.9965 -0.0349 A387 C284 0.0540627 1.1974407 0.5548169 0.3116067
931 -0.0253 -2.7755 -0.6312 A738 C881 -0.7910667 -2.6797486 -0.2267249 0.4219977
932 2.5360 0.9528 0.7051 A763 C241 1.3715588 0.6683398 0.7897889 0.5111968
933 0.9584 -1.8563 0.9960 A747 C781 0.3633024 -2.3921816 0.7580945 0.4562949
934 -2.5089 -1.0376 0.6991 A143 C739 -2.2003530 -0.5265358 0.8437657 0.3214256
935 -0.9541 2.6986 0.5064 A94 C238 -1.2449921 1.9998908 0.7761860 0.4197958
936 0.9970 2.2589 -0.8831 A440 C55 1.2437441 2.2381567 -0.6486543 0.1673110
937 1.9058 0.1549 0.9961 A746 C241 1.3715588 0.6683398 0.7897889 0.4179973
938 2.2268 0.7466 -0.9373 A769 C143 2.1955674 0.6403446 -0.8206917 0.0846988
939 2.7736 1.1191 0.1346 A806 B9 2.8198727 1.0084818 0.0784909 0.0710000
940 2.6299 -0.9052 -0.6242 A867 C449 1.9465088 -1.1051116 -0.8303191 0.3631406
941 -1.6305 -1.8828 -0.8713 A302 C734 -1.1324839 -1.3326418 -0.8951142 0.3573295
942 -1.5360 0.0745 -0.8868 A216 C561 -1.4573729 -0.0038881 -0.7758762 0.0893130
943 -1.3707 1.1199 -0.9732 A149 C369 -0.6886751 1.0313763 -0.5793286 0.3881400
944 -1.5415 -1.6250 -0.9708 A324 C734 -1.1324839 -1.3326418 -0.8951142 0.2590200
945 0.2870 0.9579 0.0037 A422 C284 0.0540627 1.1974407 0.5548169 0.3411983
946 0.8914 -0.6562 0.4499 A624 C584 0.3972782 -1.1889885 0.6190008 0.3986704
947 -2.2250 -1.3562 0.7957 A189 C739 -2.2003530 -0.5265358 0.8437657 0.3007923
948 -0.2793 -0.9638 0.0829 A490 C600 -0.7691625 -0.6921120 0.2002673 0.2929726
949 0.6901 0.7277 -0.0766 A499 C248 0.7051029 1.0093226 -0.5739830 0.2646695
950 -0.9419 1.7507 -0.9999 A170 C369 -0.6886751 1.0313763 -0.5793286 0.4643733
951 0.3714 -1.1842 0.6511 A585 C584 0.3972782 -1.1889885 0.6190008 0.0209219
952 1.3985 -2.6531 -0.0424 A870 B17 1.4171714 -2.6314714 -0.1032857 0.0337286
953 -0.6511 2.4202 0.8624 A128 C238 -1.2449921 1.9998908 0.7761860 0.3668051
954 -1.7156 -0.9813 -0.9997 A256 C734 -1.1324839 -1.3326418 -0.8951142 0.3463479
955 -0.0321 2.6715 -0.7409 A210 C136 -0.3880524 2.3702890 -0.7627342 0.2263325
956 -1.4324 -2.5175 0.4430 A393 C796 -1.2416195 -1.6918074 0.8854221 0.4862984
957 0.8342 0.9826 0.7032 A514 C241 1.3715588 0.6683398 0.7897889 0.3127360
958 0.7574 2.7446 0.5313 A389 C86 0.4002570 2.4565213 0.7038322 0.2725846
959 0.9689 -0.2485 -0.0222 A602 C401 1.0490181 -0.2102674 -0.2660895 0.1207467
960 2.0099 2.0956 0.4282 A728 C43 2.0166149 1.7911539 0.4786294 0.1205301
961 -1.0850 1.2580 -0.9409 A226 C369 -0.6886751 1.0313763 -0.5793286 0.3281733
962 -0.4320 0.9920 -0.3965 A347 C369 -0.6886751 1.0313763 -0.5793286 0.1596267
963 -0.3248 1.7327 0.9715 A248 C284 0.0540627 1.1974407 0.5548169 0.4436017
964 0.8015 0.7195 -0.3850 A505 C248 0.7051029 1.0093226 -0.5739830 0.1917342
965 1.6184 1.1060 -0.9992 A605 C143 2.1955674 0.6403446 -0.8206917 0.4071104
966 1.9549 0.6513 0.9982 A723 C241 1.3715588 0.6683398 0.7897889 0.2695973
967 0.6016 1.9796 -0.9976 A396 C55 1.2437441 2.2381567 -0.6486543 0.4165488
968 0.8588 -1.1465 -0.8234 A654 C595 0.2284234 -1.2015087 -0.5962098 0.3041918
969 0.4688 1.6845 0.9679 A388 C86 0.4002570 2.4565213 0.7038322 0.3682107
970 -0.1408 -2.7943 -0.6028 A738 C881 -0.7910667 -2.6797486 -0.2267249 0.3802977
971 -1.1176 -2.0523 0.9415 A461 C796 -1.2416195 -1.6918074 0.8854221 0.1801967
972 -1.6772 1.5848 -0.9515 A116 C349 -1.9528751 1.6099130 -0.6652502 0.1956793
973 0.7082 2.4041 0.8624 A427 C86 0.4002570 2.4565213 0.7038322 0.1729774
974 1.3357 -2.4099 -0.6553 A827 C749 0.7737695 -2.3217321 -0.7810068 0.2586017
975 -0.2397 -1.6921 0.9567 A579 C796 -1.2416195 -1.6918074 0.8854221 0.3578300
976 -1.6549 -2.0691 0.7604 A287 C796 -1.2416195 -1.6918074 0.8854221 0.3051984
977 1.1053 -0.4494 -0.5907 A651 C401 1.0490181 -0.2102674 -0.2660895 0.2066750
978 -2.2011 -2.0122 0.1873 A176 C885 -2.2205445 -1.6962525 -0.2547697 0.2591539
979 0.4045 0.9328 -0.1822 A455 C248 0.7051029 1.0093226 -0.5739830 0.2563028
980 0.6069 -1.0242 -0.5871 A614 C595 0.2284234 -1.2015087 -0.5962098 0.1882984
981 -0.1892 1.5526 -0.9000 A321 C136 -0.3880524 2.3702890 -0.7627342 0.3846024
982 -1.1876 0.2951 -0.6304 A253 C561 -1.4573729 -0.0038881 -0.7758762 0.2380790
983 1.8731 0.7602 -0.9998 A707 C143 2.1955674 0.6403446 -0.8206917 0.2071437
984 0.6882 -0.7364 0.1255 A593 C584 0.3972782 -1.1889885 0.6190008 0.4123371
985 -1.1230 -0.0157 0.4807 A293 C486 -1.1082059 0.3959832 0.5157773 0.1538515
986 -1.3121 -0.3849 0.7745 A262 C739 -2.2003530 -0.5265358 0.8437657 0.3663848
987 1.3701 -0.8981 -0.9323 A717 C449 1.9465088 -1.1051116 -0.8303191 0.2951338
988 0.0325 -2.3530 0.9355 A718 C781 0.3633024 -2.3921816 0.7580945 0.1824631
989 -0.8790 1.0695 -0.7880 A226 C369 -0.6886751 1.0313763 -0.5793286 0.1457067
990 -1.4301 -1.2990 -0.9977 A324 C734 -1.1324839 -1.3326418 -0.8951142 0.1446146
991 -1.2881 0.0288 0.7026 A249 C486 -1.1082059 0.3959832 0.5157773 0.2446333
992 0.4156 -1.3423 0.8038 A634 C584 0.3972782 -1.1889885 0.6190008 0.1188108
993 0.5121 0.8629 0.0822 A475 C248 0.7051029 1.0093226 -0.5739830 0.3318695
994 -1.9841 -0.3018 -1.0000 A178 C561 -1.4573729 -0.0038881 -0.7758762 0.3495876
995 -0.9187 -0.6281 0.4614 A383 C600 -0.7691625 -0.6921120 0.2002673 0.1582274
996 1.0940 2.7751 -0.1841 A473 C55 1.2437441 2.2381567 -0.6486543 0.3837472
997 -0.7558 -0.7225 0.2987 A418 C600 -0.7691625 -0.6921120 0.2002673 0.0473944
998 2.3255 0.8772 0.8743 A745 C241 1.3715588 0.6683398 0.7897889 0.4157708
999 -2.6395 -0.2707 -0.7571 A98 C735 -2.6744307 -0.1083882 -0.5495668 0.1349252
1000 -0.5966 -1.7935 0.9939 A534 C796 -1.2416195 -1.6918074 0.8854221 0.2850634
hist(act_pred$diff, breaks = 30, col = "blue", main = "Mean Absolute Difference", xlab = "Difference")

Figure 22: Mean Absolute Difference
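Beyond the histogram, a quick numeric summary of the per-point deviation can be computed directly from the diff column. The snippet below is a minimal sketch: the small data frame is a hypothetical stand-in for the act_pred scoring output shown above, with illustrative values rather than figures from the actual run.

```r
# Hypothetical stand-in for the act_pred scoring output above;
# only the diff column (per-point deviation between actual and
# predicted centroid coordinates) is mimicked here.
act_pred <- data.frame(diff = c(0.36, 0.32, 0.41))

# Overall mean absolute difference across the scored points
mean_abs_diff <- mean(act_pred$diff)
print(mean_abs_diff)

# Five-number summary of the per-point deviations
summary(act_pred$diff)
```

A low mean and a tight distribution of diff values indicate that the compressed map's centroids track the raw coordinates closely.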

8. Executive Summary

This notebook demonstrated the end-to-end HVT workflow on a simulated torus dataset: compressing the data with hierarchical vector quantization, projecting the compressed cells with Sammon's non-linear algorithm, visualizing them as Voronoi tessellations, and scoring new observations against the layered maps with scoreLayeredHVT. The scored output assigns each point to a cell and its centroid, and the distribution of differences between actual and predicted coordinates (Figure 22) provides a quick check of how faithfully the compressed map represents the raw data.

9. References

  1. Topology Preserving Maps: https://users.ics.aalto.fi/jhollmen/dippa/node9.html

  2. Vector Quantization: https://en.wikipedia.org/wiki/Vector_quantization

  3. K-means: https://en.wikipedia.org/wiki/K-means_clustering

  4. Sammon's Projection: https://en.wikipedia.org/wiki/Sammon_mapping

  5. Voronoi Tessellations: https://en.wikipedia.org/wiki/Centroidal_Voronoi_tessellation